Politecnico di Torino
Academic Year 2015/16
02RAUIU, 02RAUIH, 02RAUIV, 02RAUIW, 02RAUIY, 02RAUKG, 02RAUKI, 02RAURK, 02RAURL, 02RAURO, 02RAURP, 02RAURQ, 02RAURR, 02RAURU, 02RAURV, 02RAURW
Topics In Internet & Society Interdisciplinary Studies
PhD Programme in Computer and Systems Engineering - Torino
PhD Programme in Structural Engineering - Torino
PhD Programme in Energetics - Torino
Lecturer: Juan Carlos De Martin (AC, ING-INF/05) – Lectures: 0, Exercises: 0, Lab: 0, Years of appointment: 3
SSD, CFU, educational activities, disciplinary fields: *** N/A ***
Course objectives
PERIOD: JUNE 2016

The course will be taught by:
- Charles Nesson - Harvard University (USA)
- Brett Frischmann - Benjamin N. Cardozo School of Law, Yeshiva University, New York
- Lucie Guibault - University of Amsterdam
- Jean-Claude Guédon - Université de Montréal

It is commonly understood that the Internet now affects almost every aspect of society – from knowledge sharing to economic interactions – and of personal life (digital divide permitting). Understanding the relationship between the Internet and society is a complex matter, since its shape and evolution do not depend on immutable technological laws, but are instead the consequence of specific choices, both private and public, that could – at least in principle – very well change over time.
This course – addressed to students of all Doctoral Programmes – aims to provide an interdisciplinary overview of a selection of Internet & Society topics currently addressed by scholars at the global level, all of which have tangible implications in many domains and may very well suggest new lines of reasoning for the ongoing research of PhD candidates.
Programme
These topics include, but are not limited to:
- knowledge creation and sharing in the digital environment, applied, e.g., to scientific
publications, addressing legal and social aspects of Open Access publishing, and to
software, discussing the principles and applications of Free Software;
- the role of the Internet in the evolution of journalism, civic hacking, and civic engagement;
- the evolution of the Internet architecture and protocols with respect to entrepreneurial
models, and the economic environment;
- the future of Internet & Society research, also taking into account ongoing technological
innovations such as the Internet of things, discussing their opportunities and risks.
All topics will be addressed from an interdisciplinary point of view (i.e., considering at least two disciplines) and will make broad use of examples.

Schedule in brief
Thursday 23 June, 10:00-15:30, Sala Conferenze - Luigi Ciminiera
Monday 27 June, 8:15-18:00, Aula Formazione
Tuesday 28 June, 9:00-12:30, Aula Formazione
Wednesday 29 June, 15:00-16:30, Aula Formazione - Examination
Programme of the lectures

Thursday 23 June

10:00 - 11:30 Thoughts on Techno-Social Engineering of Humans and the Freedom to Be Off, Brett Frischmann, Yeshiva University (USA)

A lecture based on four chapters from my forthcoming book, Being Human in the 21st Century: How Social and Technological Tools are Reshaping Humanity (Cambridge 2017).

Many people complain that technologies dehumanize. It is difficult, however, to know when a line has been crossed, when the social engineering has gone too far, when something meaningful has been lost. Do we know when technology replaces or diminishes our humanity? Can we detect when this happens? To begin to answer these questions, we would have to know what constitutes our humanity, what makes us human in the first place. And that turns out to be an incredibly difficult question, one that has been debated without resolution for millennia. Given such difficulty at the most foundational level, it should not be surprising that we do not have a reliable method for identifying and evaluating technological impacts on our humanity. Of course, people often claim that technologies dehumanize, especially in recent decades with the widespread adoption of computers, the Internet, and more recently, smartphones. But the public generally takes such claims to be alarmist, and so the claims remain untested and ultimately drowned out by rampant enthusiasm for new technology. Yet techno-social engineering of humans exists on an unprecedented scale and scope, and it is only growing more pervasive as we embed networked sensors in our public and private spaces, our devices, our clothing and ourselves.

Being Human in the Twenty First Century moves beyond the impasse and engages these fundamental questions in a fresh, intellectually rigorous, and conceptually exhilarating manner. The book develops a method to examine the dynamic relationship between humans and the technologies we develop and use. At its core, the method depends on a fundamental and radical repurposing of the Turing test, which is the test that Alan Turing famously proposed to examine whether a machine can think. Turing’s test examined the line between humans and machines, focusing on the machine side of the line and giving rise to the field of artificial intelligence and machine learning. This book examines the human side of the line. It uses machines as a baseline and asks when and how humans behave in a machine-like manner. It begins with different types of intelligence, but it does not end there. There is more to humanity than that. The method is extended to different capacities, such as the capacity to relate to others, as well as the core concepts of free will and autonomy. The radically repurposed Turing tests are plausible tests to employ, but more importantly, the tests serve as conceptual tools that enable a deeper consideration of what makes us human and how our humanity is reflected in and affected by the technologies we develop and use. Throughout the book, the concepts are grounded with familiar examples, such as call centers, the assembly line, public schools, online electronic contracting, wearable technologies, social media, and many others. The last two chapters directly focus on interconnected sensor networks, the Internet of Things, and (big) data enabled automation of systems around, about, on and in human beings.




14:00 - 15:30 Transparency, the new objectivity, Anna Masera, La Stampa (Italy)

There is a crisis of trust in the media, and David Weinberger’s premonition that "transparency is the new objectivity" (2009) has become a recipe for a way out of this crisis. Weinberger was referring to the fact that objectivity – even if unattainable – served an important role in how we came to trust information, and in the economics of newspapers in the modern age. The problem with objectivity is that it tries to show what the world looks like from no particular point of view, which is like wondering what something looks like in the dark. With transparency as the new objectivity, what we used to believe because we thought the author was objective we now believe because we can see through the author’s writings to the sources and values that brought the author to that position. Transparency gives readers information with which they can undo some of the unintended effects of ever-present biases. Transparency brings us to reliability the way objectivity used to. This change is epochal...



Monday 27 June

8:15 - 9:45 Private spaces, public science? Open access and academic social media, Maria Chiara Pievatolo, Università di Pisa (Italy)

Proprietary social media like Academia.edu and ResearchGate allow researchers to share papers and to connect with each other. They are, however, commercial services providing walled gardens, secluded from the open web and unable to ensure long-term archiving and preservation. But their very success shows that they do answer – and monetize – a need for connecting researchers and sharing ideas that is apparently unfulfilled both by our legacy scientific publishing system and by the bulk of open access repositories. It could be useful to deal with a couple of questions: 1. from a theoretical point of view, why is our current public use of reason in such a predicament? 2. from a practical point of view, what can (young) researchers do about it?




10:00 - 11:30 Text and Data Mining: Barriers, Paths and Passable Roads, Lucie Guibault, Institute for Information Law of the University of Amsterdam (Netherlands)

This lecture presents the preliminary results of ongoing research for the EU Horizon 2020 FutureTDM project on the legal barriers to text and data mining (TDM) in Europe. It describes the objects of protection of copyright, database and data protection law respectively, and the rights and obligations that are triggered when such materials or data are mined. Exceptions under copyright and database law are generally implemented in a fragmentary way, affecting TDM activities with cross-border aspects. Also, concepts such as ‘lawful user’, ‘non-commercial’ and ‘research’ leave room for interpretation, resulting in uncertainty as to the scope of these exceptions. Under data protection law, we find that the rules are very restrictive towards TDM, and that exceptions do not provide much leeway, as they are limited in scope and vary widely in their national implementations. The project also assessed policies as to the extent to which they permit or promote TDM, in particular with respect to the following aspects: the beneficiaries of the policy, the object (category of materials, contents or data) covered by the permitted use, and the uses permitted by the policies. The lecture discusses the policies of different categories of stakeholders and the merits of Open Access (OA) policies in general and for TDM in particular. We find that funders of research, governments, research institutions and libraries generally advocate and apply OA; while OA policies for scientific publications are common and well developed, OA policies regarding research data are still in their infancy, owing to challenges concerning the confidentiality of information and privacy. Where non-OA publishers permit TDM on their collections of publications, this is commonly restricted to academic and non-commercial users and purposes.




11:45 - 13:15 Attention and diversity in a many-to-many world, Philippe Aigrain, Computer scientist and author (France)




14:30 - 16:00 , Jean-Claude Guédon, University of Montreal (Canada)




16:15 - 17:45 Constructive Use of Online Technology in the Classroom: a Harvard Law School Experience, Charles Nesson, Berkman Center for Internet & Society, Harvard University (USA)

I will speak about the challenge of diversity in a Socratic classroom. I will tell the story of Harvard Law School retiring its Royall shield in response to demands from Reclaim. I will explain the depressing effect of social conflict on classroom discourse and introduce an online modality that allows students in my class to engage in a synchronous, pseudonymous, threaded textual discussion in response to a prompt. This addition to the rhetorical modalities of the classroom facilitates forthright responses from all students. Their articulation and sharing of ideas and responses online invigorates the ensuing face-to-face discussion in class.



Tuesday 28 June

11:00 - 12:00 The Internet of Things as a Way of Life, Bruce Sterling, Science-fiction writer (USA), Jasmina Tešanović, Activist and writer (Serbia)

Bruce Sterling will describe the developing technical trends in the Internet of Things, including vocal computing and single-use wireless buttons. Jasmina Tešanović of "Casa Jasmina" will talk about the "Internet of Women Things" and open-source projects in the experimental domestic space for the Torino Fab Lab.




The Scientific Method in the era of Big (Open) Data, Antonio Vetrò, Director of Research, Nexa Center (Italy)

Different forces are shaping the methods and goals of science in the new century. The Internet and its related technologies, combined with the philosophical postulate of openness in science, are one of them: the Internet increases and extends the openness of science in new ways, from enabling open access to large amounts of resources to fostering collaborations based on Web 2.0 technologies. In such a context, in 2007 a well-known magazine even predicted that the Big Data paradigm would make the scientific method obsolete: with the availability of huge amounts of data and supercomputing, the traditional, hypothesis-driven scientific method would become obsolete, as sophisticated algorithms and statistical tools delving into massive amounts of data would automatically turn them into knowledge. This lecture will give an overview of the new paradigms of the so-called "Science 2.0" revolution, investigating the phenomenon from multiple perspectives (philosophical, statistical, normative) and providing the basis for critical reflection on the matter, with a focus on the role of data.

Wednesday 29 June

15:00 - 16:30 Examination

Provisional programme for the academic year 2015/16



© Politecnico di Torino
Corso Duca degli Abruzzi, 24 - 10129 Torino, ITALY