
Dichtung Digital
A Journal of art and culture in digital media
Enslaved by digital technology
Interview with Mihai Nadin

By Mihai Nadin and Roberto Simanowski
No. 43 – 2013-11-16
1. Prelude
Roberto Simanowski (RS):
What is your favored neologism of digital media culture and why?
Mihai Nadin (MN): “Followed”/“Follower”: It fully expresses how the past overtook the
present. “Crowd” anything: self-delusional slogans for the daisy brain.
RS: If you could go back in the history of new media and digital culture in order to prevent
something from happening or somebody from doing something, what or who would it be?
MN: I would eliminate any word that starts with “hyper”, and every scoring facility.
Alternatively, I would prevent Bill Gates from developing DOS, and Apple from giving up on its
language (iOS 7 cries out as an example of failing to live up to the company’s foundations).
Yes, I would eliminate Term Coord, the European Union’s attempt to standardize terminology.
More important: I would establish a framework for reciprocal responsibility. No company
should be immunized against liability procedures. If your product causes damage due to
sloppy design, insufficient testing, perpetuation of known defects, you are liable. Forget the
legal disclaimers that protect disruptive technologies that disrupt our lives. And no user
should be allowed to further the disruption. A simple analogy: Carmakers are liable for
anything that systematically leads to accidents; drivers are liable for using cars irresponsibly.
Does the analogy of the technology of the industrial age extend to that of the digital age? On
an ethical level, of course. Innovation does not legitimize discarding ethics.
RS: What comes to mind if you hear “Digital Media Studies”?
MN: Opportunism. The unwillingness to think about a totally different age.
RS: If you were a minister of education, what would you do about media literacy?
MN: I would introduce “Literacies” (corresponding to all senses and to cognitive abilities) as
the ubiquitous foundation of everyone’s education. “Vive la différence” would be the common
denominator.
2. Politics and Government
RS
: Web 2.0 culture seems to have tamed and commodified the wild, anarchistic Internet of
the 1990s when people played with identity in IRCs and MUDs and built their own websites in
idiosyncratic ways. Today, clarity and transparency are the dominating values, and for
obvious reasons, since only true and honest information counts as valid data in the context of
commerce. This shift has also changed the role of the government. While in the 1990s
Internet pioneers such as John Perry Barlow declared the independence of Cyberspace from
the governments of the old world, now it seems people hope for governments to intervene in
the taking-over and commercialization of the Internet by huge corporations such as Google
and Facebook. Thus, web activists calling for the government to pass laws to protect privacy
online, and politicians suggesting expiration dates for data on social networks, appear as
activists in a battle for the rights of the individual. Have the tables turned to that extent? Are
we, having once rejected the old government, now appealing to it for help?
MN: Pioneers are always mercenaries. Of course, to open a new path is a daring act – so
much can go wrong. There is a lot of romanticism in what the Internet forerunners were
saying. Most of the time, their words were far louder than their accomplishments were
meaningful or significant. Declaring the Internet as an expression of independence from the
government when you are actually captive to DARPA is comical at best. MILNET (split from
ARPANET), which further morphed into classified and non-classified Internet Protocol Router
Networks, should have warned us all about what we would eventually surrender. Was Minitel
(France, 1978) better? It offered little functionality, and was not dependent on the private data
of its users. DOS – the operating system that even in our days underlies the world of PCs
(since 1981) – was adopted without any consideration for the integrity of the individual. Apple
stole from Xerox something that, even today, the company does not fully understand. But
Xerox does data management in our days (it just took over the tollways in Texas), and Apple
sells music and whatnot – sometimes in collusion with publishers. You have to keep the
competition vigilant. In the early years, everybody was in a hurry. This was the second coming
of the California Gold Rush, in which college dropouts found opportunity. Indeed, at no
university did anybody – academic or not – know enough about the future that the pioneers
were promising to turn into paradise on Earth. When the blind lead the blind, you will never
know when you arrive, because you really don’t know where you are going.
RS: This is a strong, devastating statement: The pioneers of digital media and culture as
mercenaries, comics, thieves, dropouts, and blind persons without ideas and beliefs?
MN: Without idealizing the past or demonizing the beginners, let’s take note of the fact that
Lullus understood that with new means of expression we can better understand the universe.
And we can ask more interesting questions about the human being and its own
understanding of the world. Pascal would not miss the value of feelings in the human
perception of reality, and in the attempt to subject it to calculations. Leibniz, with whose name
computation is associated, would seek no less than a universal language for making possible,
for example, the understanding of history from a perspective of accomplishments. He was not
interested in translating Chinese philosophy word-by-word. He was interested in ideas. (If you
want to ask “What’s that?” this interview is not for you!)
College dropouts should not be vilified, but also not idealized. It helps to start something free
of the constraints of cultural conventions. It does not help to realize that what is at stake is not
a circuit board, a communication protocol, or a new piece of software, but the human
condition. The spectacular success of those whom we associate with the beginnings lies in
monetizing opportunities. They found gold! The spectacular failure lies in the emergence of
individuals who accept a level of dependence on technology that is pitiful. This dependence
explains why, instead of liberating the human being, digital technology has enslaved everyone
– including those who might never touch a keyboard or look at a monitor. To complain about
the lack of privacy is at best disingenuous. Those who rushed into the digital age gave it up!
In Web 2.0, profits were made not by producing anything, but by profiling everyone. The
nouvelle vague activism of our days is a mantra for legitimizing new profitable transactions,
not a form of resistance. If everyone really cared for their rights, we would have them back. All
that everyone really wants is a bigger piece of the pie (while starting the nth diet).
RS: I completely agree that the implications of digital culture also affect those staying away
from digital media, if such staying away is possible at all. But how guilty are those giving up
privacy, their own and as a concept in general, by rushing into the digital age? Considering
McLuhan’s convictions that first we shape our tools and afterwards our tools shape us and
that any medium has the power to impose its own assumptions on the unwary, I wonder how
deliberate the acceptance of the new media’s assumptions is. Hand in hand with the human
being’s unwariness goes the triumph of homo faber, which Hans Jonas, in his book The
Imperative of Responsibility: In Search of an Ethics for the Technological Age, calls the
human’s fatality. We are entrapped by our success, Jonas states, with respect to the human’s belief in
technology. Our power over nature has become self-acting and made man into a “compulsive
executer of his capacity.” What would be required now is a power over that power. Did we
really expect Gold Rush entrepreneurs to develop this kind of self-discipline?
MN: The echo chamber metaphor has so far been used mainly to describe politics. It simply says
that feedback of a narcissistic nature reinforces prejudices. Under Hitler, Stalin, Mao, and
current Islamic extremism, masses tend towards hysterics. Self-induced delusions and
political idolatry are twins.
Does it look any different within the objective, rational domain of science and technology?
The expectation of objectivity is sometimes rewarded: there are scientific and technological
developments of authentic novelty. But let’s be clear: revolution means to turn things around,
full circle, and in this respect, the information age is such a development. Technologically, this
is a time of amazement. Conceptually, it is rather the reinvention of the wheel in digital format.
For a long time, no new idea has percolated. The innovators aligned themselves with those in
power and those with money. When the profit potential of the typewriter – the front end of
IBM computers in the attempt to be free of perforated cards – was exhausted, word
processing emerged. The X-acto knife gave way to the cut-and-paste procedure. It was not a
new way of thinking, but rather a continuation of old patterns.
I am deeply convinced that computation (not only in its digital format) will eventually open up
new opportunities and break from the past. The self-discipline in your question – how to keep
a lid on the obsession with profit at any price – should actually become the determination to
give free rein to creativity. Under the pressure of profit-making, there is no authentic freedom.
In the echo chamber of science, celebration of one adopted perspective – the deterministic
machine – leads to the automatic rejection of any alternative.
RS: Big Data is the buzzword of our time and the title of many articles and books, such as Big
Data: A Revolution That Will Transform How We Live, Work, and Think by Viktor
Mayer-Schönberger and Kenneth Cukier (2013). The embracing response to the digitization and datafication of
everything is Data Love, as the 2011 title of the conference series NEXT reads, which informs
the business world about “how the consumer on the Internet will be evolving.” It is a well-
known fact that Big Data Mining undermines privacy. Is, however, that love mutual, given the
acceptance and even cooperation of most of the people?
MN: Big data represents the ultimate surrender to the technology of brute force. Wars are big
data endeavors, so are the economic wars, not to mention the obsession with power and total
control of the so-called “free individual.” Whether we like it or not, “information society”
remains the closest description of the age of computers, networks, smartphones, sensors,
and everything else that shapes life and work today. We are leaving behind huge amounts of
data – some significant, some insignificant. Babbage’s machine, like the first recording
devices, the abacus, and so many pneumatic and hydraulic contraptions, is of
documentary importance. I am sure that if entrepreneurs of our days could find any value in
them, they would not hesitate to make them their own and add them to the IP portfolio of their
new ventures. What cannot be monetized is the human condition expressed in such previous
accomplishments. You cannot resuscitate Babbage or Peirce, except maybe for some
Hollywood production or some new game.
Data becomes information only when it is associated with meaning. However, our age is one
of unreflected data generation, not one of quest for meaning. Data production (“Give me the
numbers!”) is the new religion. Politics, economics, and science are all reduced to data
production. Ownership of data replaced ownership of land, tools, and machines. Human
interaction is also reduced to data production: what we buy, where we buy, whom we talk to,
for how long, how often, etc. The Internet as the conduit for data is boring and deceiving. This
is not what Vinton Cerf, to whose name the global transmission protocol TCP/IP is attached,
had in mind. Instead of becoming a medium for interaction, the Internet got stuck in the
model of pipes (sewage pipes, oil pipes, water pipes, and gas distribution pipes) and pumps
(servers being engines that pump data from one place to another). Berners-Lee’s world-wide
web made it easier to become part of the network: the browser is the peephole through which
anyone can peek and everyone’s eyeballs become a commodity. Great pronouncements will
not change this reality more than radical criticism (sometimes, I confess, a bit exaggerated).
But we should at least know what we are referring to.
By the way: creative work – of artists, scientists, craftsmen (and women) – takes place on
account of sparse data. Survival is a matter of minimal data, but of relevant information.
RS: Again, your account is quite radical and disillusioning, though not unjustified. In response,
let me ask to what extent the browser reduces people to commodified eyeballs. Hasn’t the
Internet (or the Web 2.0) rather turned every viewer and listener into a potential sender thus
weaving a network of “wreaders”, as George P. Landow termed the reader-authors in
hypertext in 1994, or “prosumers”, as the corresponding Web 2.0 concept reads? Isn’t this
the spread of the microphone, to allude to Bertolt Brecht’s demand in his radio essays 85
years ago? Isn’t this dogma of interaction the actual problem?
MN: In the evolution from centralized computing (the “big iron” of the not so remote past) to
workstations, to client server architecture, to the Cloud (real-time) re-centralization, we have
not come close to establishing the premise for a human knowledge project. “The knowledge
economy” is a slogan more than anything else. Computation made possible the replacement
of living knowledge by automated procedures. However, most of the time, computation has
remained in its syntax-dominated infancy. On a few occasions, it started to expand into the
semantic space. The time for reaching the pragmatic level of authentic interactions has not
yet come. If and when it comes, we will end the infancy stage of computation. “Eyeballs” are
not for interaction in meaningful activities, but rather for enticing consumers. Interaction
engages more than what we see.
RS: One basic tool of data accumulation and mining is Google, which through every search
query not only learns more about what people want and how society works, but also
centralizes and controls knowledge through projects such as Google Books. How do you see
this development?
MN: There is no tragedy in digitizing all the world’s books, or making a library of all music, all
movies, etc. After all, we want to gain access to them. This is their reason for being: to be
read, listened to, experienced. The tragedy begins when the only reason for doing so is to
monetize our desire to know and to do something with that knowledge. I remember shaking
hands with that young fellow to whom Terry Winograd introduced me (May 1999). Larry Page
was totally enthusiastic upon hearing from me about something called “semiotics”. At that
time (let me repeat, 1999) none of my friends knew what Google was, and even less how it
worked. They knew of Mosaic (later Netscape Navigator), of the browser wars, even of
AltaVista, Gopher, and Lycos (some survived until recently). Today, none can avoid
“Googling”. (Lucky us, we don’t have to “Yahoo!”) The act of searching is the beginning of
pragmatics. Yes, we search in the first place because we want to do something (not only find
quotes). Pragmatics is “doing” something, and in the process, recruiting resources related to
the purpose pursued. Larry Page is one of the many billionaires who deserve to be celebrated
for opening new avenues through searches that are based on the intuitive notion that
convergence (of interest) can be used in order to find out what is relevant. But nobody will tell
him – as no one will tell Raymond Kurzweil – that the real challenge has yet to be addressed:
to provide the pragmatic dimension. The fact that Google “knows” when the flu season starts
(check out searches related to flu) is good. But if you use this knowledge only for selling ads,
you miss the opportunity to trigger meaningful activities. Seeking life everlasting is not really a
Google endeavor. It is a passion for which many people (some smart, some half-witted) are
willing to spend part of their fortunes. They can do what they want with their money. Period!
But maybe somebody should tell them that it makes more sense to initiate a course of action
focused on the betterment of the human condition. Or at least (if betterment sounds too
socialist) for more awareness, higher sense of responsibility. Properly conceived, Facebook
(or any of the many similar attempts) could have been one possible learning environment, way
better adapted to education than the new fashionable MOOCs [massive open online courses].
Instead, it is an agent of a new form of addiction and human debasement.
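
[An illustrative aside: the “convergence (of interest)” Nadin credits to Page is, in essence, link analysis. Below is a minimal Python sketch of that idea as power-iteration PageRank; the toy graph, the damping factor of 0.85, and the function name are assumptions made for the example, not a description of Google’s production system.]

# Illustrative only: a toy power-iteration PageRank. The graph and the
# damping value are hypothetical; real search ranking is far more involved.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform guess
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its weight evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share  # "votes" converge on cited pages
        rank = new_rank
    return rank

# A page that many others point to accumulates rank: convergence of interest.
toy_web = {"a": ["c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(toy_web))  # "c" ends up with the highest score
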
3. Algorithms and Censorship
RS: The numbers of views, likes, comments, and the Klout Score – as a measure of one’s
influence in social media – indicate the social extension of the technical paradigm of digital
media: counting. The quantification of evaluation only seems to fulfill the cultural logic of
computation, the dichotomy of like/dislike even to mirror the binary of its operational system.
The desired effect of counting is comparison and ranking, i.e., the end of postmodern
ambiguity and relativism. Does the trust in numbers in digital media bring about the
technological solution to a philosophical problem? A Hollywood-like shift from the
melancholia of the end of grand narratives and truth to the excitement of who or what wins
the competition?
MN: “Scoring” is the American obsession of the rather insecure beginnings projected upon
the whole world. In the meaningless universe of scoring, your phone call with a provider is
followed by the automatic message eliciting a score: “How did we do?” Less than 2% of
users fall into the trap. The vast majority scores only when the experience was extremely bad
– and way too often, it is bad. This is one example of how, in the age of communication
(telling someone how to improve, for example), there is no communication: the score data is
machine processed. The best we can do when we want to achieve something is to talk to a
machine. The one-way only channels have replaced the two-way dialog that was meant to be
the great opportunity of the digital age. We don’t want to pay for talking with a human being.
In reality, we don’t want to pay for anything. As the scale expands, everything becomes
cheap, but nothing has become really better. To speak about influence in social media, for
example, is to be self-delusional. Scoring is not only inconsequential, but also meaningless.
Indeed, in this broad context, the human condition changes to the extent that the notion of
responsibility vanishes. Consequences associated with our direct acts and decisions are by
now projected at the end of a long chain of subsequent steps. At best, we only trigger
processes: “Let’s warm up a frozen pizza”. You press a button; the rest is no longer your
doing in any form or shape. More than ever before has the human being been rendered a
captive receiver under the promise of being empowered. The variety of offerings has
expanded to the extent that, instead of informed choices, we are left with the randomness of
the instant. As a matter of fact, the “living” in the living is neutralized. The age of machines is
making us behave more like machines than machines themselves. The excitement and energy
of anticipation are replaced by quasi-instinctual reactions. By no means do I like to suggest
an image of the end of humanity, or of humanness. There is so much to this age of
information that one can only expect and predict the better. For the better to happen, we
should realize that dependence on technology is not the same as empowerment through
technology. The secular “Church of Computation” (as yet another Church of Machines) is at
best an expression of ignorance. If you experience quantum computation, genetic
computation, intelligent agents, or massive neural networks, you realize how limiting the
deterministic view of the Turing machine is. And you learn something else: There is a price to
everything we want or feel entitled to.
RS: During the debate over the NSA scandal in summer 2013, Evgeny Morozov entitled an
essay in the German newspaper Frankfurter Allgemeine Zeitung The Price of Hypocrisy,
holding that not only the secret service or the government undermines the privacy of the
citizen, but the citizens themselves by participating in information consumerism. Morozov
points to the Internet of things that will require even more private information in order to work,
i.e., to facilitate and automate processes in everyday life that until now we had to take upon
ourselves. The price for this kind of extension of man’s brain is its deterioration through the
loss of use. On the other hand, some claim that if the swimming pool heats up automatically
after discovering a BBQ scheduled in our calendar, our brains are freed up for more important
things.
MN: Yes, we want to automate everything – under the illusion that this will free us from being
responsible for our own lives. For those shocked by the revelation that there is no privacy on
the Internet of data, I can only say: This is a good measure of your level of ignorance, and
acceptance of a condition in which we surrender to the system. Walk into Taste Tea in San
Francisco, where the credit card app “Square” registers your iPhone presence and logs into
your account. This is not government intrusion, but the convenience of automated payment.
We cannot have it both ways – privacy and no privacy at all. Fundamentally, the age of
information is by definition an age of total transparency. That Internet imaginaire that some
pioneers crowed about – it will make us all more creative, freer than ever, more concerned
about each other – was only in their heads. And not even there. The “innocents” were already
in bed with the government – and with the big money.
It is not the government that betrayed the Internet. The “innocents” volunteered back doors
as they became the world’s largest contracting workforce for spy agencies. The hotness IQ
ranking for university studies in our days (cybersecurity, anyone? data-mining?) reflects the
situation described above: “Follow the money!” Total transparency is difficult. A new human
condition that accepts total transparency will not miraculously emerge, neither in San
Francisco, nor in India, China, or Singapore. Government will have to be transparent. Who is
prepared for this quantum leap? The government could make it clear: We observe you all (and
they do, regardless of whether they make it known or not). Those hiding something will try to
outsmart the system. The rest will probably be entitled to ask the Government: Since you are
keeping track of everything, why not provide a service? My files are lost, you have them,
provide help when I need it. We pay for being observed, why not get something in return?
RS: The Internet of things leads to a central topic of your work during the last decade. In your
article “Antecapere ergo sum: what price knowledge” you foresee a rather bleak future in
which responsibility is transferred from humans to machines by calculation and algorithmic
datamining. You also speak of a new Faustian deal where Faust conjures the Universal
Computer: “I am willing to give up better Judgment for the Calculation that will make the
future the present of all my wishes and desires fulfilled.” How do anticipation, computation,
Goethe’s Faust and Descartes’ ergo sum relate to each other?
MN: In order to understand the profound consequences of the Information revolution, one has
to juxtapose the characteristics of previous pragmatic frameworks. I did this in my book, The
Civilization of Illiteracy (a work begun in 1981 and published in 1998), available for free
download on the Internet. (Try my site, or any Gutenberg Project site.) There are books that
age fast (almost before publication); others that age well, and others waiting for reality to
catch up. This book is more than ever the book of our time. I don’t want to rehash ideas from
the book, but I’d like to make as many people as possible aware of the fact that we are
transitioning from a pragmatics of centralism, hierarchy, sequentiality, and linearity to a
framework in which configuration, distribution, parallelism, and non-linearity become
necessary. The theocracy of determinism (cause→effect) gives way to non-determinism
(cause→effect←cause). It is not an easy process because, for a long time we (in western
civilization, at least) have been shaped by views of a deterministic nature.
To understand the transition, we must get our hands dirty in pulling things apart: pretty much
like children trying to figure out how toys work. Well, some of those toys are no longer the
cars and trains that my generation broke to pieces, convinced that what made them run was
hidden down there, in the screws and gears forming part of their physical makeup. Search
engines, algorithms, and rankings – the new toys of our time – are only epiphenomenal
aspects. At this moment, nobody can stop people from Googling (or from tearing apart the
code behind Google), and even less from believing that what the search comes up with is
what they are looking for.
We rarely, if ever, learn from the success of a bigger machine, a larger database, a more
functional robot, or a more engaging game. We usually learn from breakdowns. It is in this
respect that any medium becomes social to the extent that it is “socialized”. The so-called
“social media” are top-down phenomena. None is the outcome of social phenomena
characteristic of what we know as “revolutions” (scientific, technological, political, economic,
etc.). They are the victory of “We can” over “What do we want?” or “Why?” And as usual, I go
for questions instead of appropriating the slogans of others.
RS: As a remark on how we capitulate to our capabilities: During the anti-NSA protests in
summer 2013, somebody presented a poster stating “Yes we scan”. This of course alluded to
the famous slogan in Obama’s election campaign, articulating disappointment in the new
president and perhaps also calling for a new movement. Read together, both slogans
symbolize the law at least of the technological part of society: We scan because we can.
MN: Without accepting even a hint of a dark plot, we need to understand what is called
“social media” as an outcome of the transaction economy. This replaces the industrial model,
even the post-industrial model. To stick to the toy metaphor: Someone decided that we are all
entitled to our little cars, fire engines, and trucks. We get them because they are deemed
good for us. And before we even start being curious, the next batch replaces what we just
started to examine. It is no longer our time, as inquisitive children, that counts. Others
prescribe the rhythm for our inquisitive instincts. And for this they redistribute wealth. In rich
and poor countries, phones are given away. You need to keep the automated machines busy.
Money is not made on the phones but on the transmission of data. This is a new age in the
evolution of humankind. Its definitory entity is the transaction, carried out with the expectation
of faster cycles of change, but not because we are smarter and less inert; rather because our
existence depends on consuming more of everything.
The faster things move around, the faster the cycle of producing for the sake of consumption.
Each cycle is motivated by profit-making. The huge server farms – the toys of those
controlling our economic or political identity – are really not at all different from the financial
transaction engines. Nothing is produced. A continuous wager on the most primitive instincts
is all that happens. Thousands of followers post sexually explicit messages and invitations to
mob activity; they trade in gossip and sell illusions. Ignorance sells better and more easily
than anything else. Not only copper bracelets, cheap Viagra, diet pills, and everything else
that succeeds in a large-scale market. If you Google, you first get what those who have paid
for it want you to see, sometimes to the detriment of other (maybe better) options. Fake
crowds are engineered for those living in the delusion of crowd sourcing.
The transaction economy, with all its high-risk speculation, is the brain-child of Silicon Valley.
San Francisco is far more powerful than Washington DC, New York, and even Hollywood.
Chamath Palihapitiya put it bluntly: “We’re in this really interesting shift. The center of power is
here, make no mistake. I think we’ve known it now for probably four or five years. But it’s
becoming excruciatingly, obviously clear to everyone else that where value is created is no
longer in New York, it’s no longer in Washington, it’s no longer in LA. It’s in San Francisco and
the Bay Area.” (Palihapitiya is one among many bigwigs going public on such a subject.)
RS: Adorno once accused the culture industry of liberating people from thinking as negation,
as addressing the status quo. Being busy learning the status quo, i.e., finding out how all the
new toys work – politically upgraded and camouflaged by euphemistic concepts such as
“social” and “interactive” – seems to be a clever strategy to achieve the same result. Your
new book, Are You Stupid? describes stupidity as the outcome of a system faking change
because it is afraid of it. Who rules that system? Who is behind that strategy?
MN: Let us be very clear: the revolutionary technology that was seen as liberating in so many
ways actually became the underlying foundation of the transaction economy. Never before
has the public been forced into the rental economy model as much as the digital revolution
has done. You no longer own what you buy, but rent the usage. And even that is not
straightforward. It has become impossible to connect to the Internet without being forced into
a new version or a new patch. It has all gone so far that to buy a cell phone means to become
captive to a provider. In the USA, a law had to be promulgated in order to allow a person to
unlock “your” phone! (Remember: this is the age of the rent economy, of transactions, not
production.) Profits have grown exponentially; service never lives up to promises made and to
the shamelessly high prices charged. In this context, social media has become not an
opportunity for diversity and resistance, but rather a background for conformity. Once upon a
time, within the office model, it was not unusual that women working together noticed that
their menstrual periods synchronized. Check out the “Friends” on various social media:
They now all “think” the same way, or have the same opinion. That is, they align to fashion
and trends, they have their “period” synchronized. All at the lowest common denominator.
In making reference to such aspects of the “social” media, I might sound more critical than
their investors would prefer. But as long as we continue to idealize a technology of
disenfranchisement and impotence, we will not overcome the limitations of obsession with
data to the detriment of information. The toy train, reduced to meaningless pieces, entirely lost
its meaning. Remember trying to make it move as it did before curiosity got the better of it?
Information eventually grew from playing with toys: the realization that things belong together,
that the wheel has a special function, etc. Given the fact that in the digital embodiment
knowledge is actually hidden, replaced by data, the human condition that results is one of
dependence. There is no citizenry in the obsession with the newest gadget, bought on credit
and discarded as soon as the next version makes the headlines. The Netizen that we
dreamed of is more a sucker than an agent of change.
RS: How do you see the role that search engines such as Google play in society?
MN: In everything individuals do, they influence the world – and are influenced by the world.
Within an authentic democracy, this is an authentic two-way street: you elect and you can be
elected. Google, or any other search engine for that matter, reflects the skewed relation
between individuals and reality. Some own more than the other(s). This ownership is not just
economic. It can take many other forms. If you search for the same word on various sites and
at various times, the return will be different. In the naïve phase of data searching, way back in
the 90s of the last century, relevance counted most. In the transaction economy, search itself
is monetized: many businesses offer SEO functions. It pays to “find” data associated with
higher rewards. Such rewards are advertisement, political recognition, technical ability, etc. In
other words, through the search, a cognitive, economic, political, etc. reality is engineered as
the “engine” forces the searcher to receive it.
Of course, social media relies on search engines, because instead of empowering
participants, it engineers the nature of their relations. This remark is not meant to demonize
anyone. Rather, it is to establish the fact that in post-industrial capitalism, profit-making is
accelerated as a condition for economic success. Those who do not keep up with the speed
of fast transactions turn into the stories of wasted venture capital and failed start-ups. The
cemetery of failed attempts to work for the common good is rarely visited. America, and to a
certain extent Germany, England, and France, suck up talent from the rest of the world,
instead of rethinking education for the new context of life and work in the information society.
When the world learned about the worrisome depth at which privacy was emptied of any
meaning, by government and business, it was difficult to distinguish between admiration and
uproar.
The fascinating Silicon Valley ecology deserves better than unreserved admiration. It is time to
debunk the mythology of self-made millionaires and billionaires – and even more the aura of
foundations à la Gates, which are mostly self-serving. America has encouraged the rush to
the new gold not because it loves the new science and technology, but rather because it
recognized new forms of profit-making. Unfortunately, the human condition associated with
the information society continues to be ignored. At the scale at which profits are multiplying,
crime is also multiplying.
4. Art and Aesthetics
RS: The use of digital technology in making art has created a new term: digital art. How do
you see art in the age of its digital production?
MN: There is no such thing as “digital art” just as there is no “digital humanities”. Something
qualifies as art (or humanities) or it does not. All the formulations that have emerged parallel to
increased computer use are an attempt to signal keeping pace with science and technology;
but fundamentally they do nothing to define new forms of writing and composing, new types
of performance. Technology does not make anyone more talented (and even less more
humane). Yes: the MMOGs [massively multiplayer online games] were not possible before,
and the newest SPOCs [self-paced open courses] – the alternative to MOOCs [massive open
online courses] – even less. But are they meaningful? Yes again: very many aesthetic
experiments became possible only on account of new forms of interaction. Virtual reality is a
new medium. There is a lot of work going on in the direction of capturing emotional aspects in
some digital format. Many forms of human expression (gestures, facial expression, eye
movement, etc.) are successfully integrated in syncretic means of communication. Reading
and writing changed. So did drawing, painting, and sculpture. 3D printing expanded aesthetic
expression. And all this on account of a technology that is still in its primitive phase. However,
technology on its own does not contribute values. It makes them possible – as it makes
possible irrational and destructive behavior.
RS: However, isn’t this reliance on digital technology also an essential factor in the process of
production and perception of that art, thus justifying its definition as digital, i.e., art born in
digital media? Does not the nature of the medium – digital technology – bestow specific
characteristics upon the artwork: the need to carefully program (which reestablishes the
central role of craft), the possibility to respond (which leans towards interactions between
artwork and audience), the link to the Internet (which allows the artwork to present content
foreign even to the artist herself)?
MN: The emptiness of the singularity prophecy (of the same nature as the promise of healing death) is the
result of confusing outcome with process. The only legitimacy of art is that of being process.
If all the sculptures of antiquity were to disappear tomorrow, if all the Grecian temples were to
disintegrate, if the great monuments of India and Persia, the artifacts of China and Russia
were to vanish, the loss would probably be less significant than the loss caused by a fire that
devastated all the world’s forests. The understanding of art as process means that the human
condition shaped by those who made the works is what actually counts. Ideas, new ways of
looking at things, of hearing and seeing, change the human being. Let’s recall an idea from
tenor John Potter: disable all forms of digital storage and reproduction in order to return to
music as a living experience. Is he a new Luddite? By no means. Under the pressure of the
technology, we gave up almost every spontaneous human expression. Interaction was
streamlined. The fact that technology does not have to result in such a flattening of the
human condition is more an expression of hope than a description of reality. Whether we like
it or not, the live experience of music, of theater, of dance, or of anything that can be digitally
stored and reproduced, belongs to the past. No matter how much some of us are still
convinced that it was a richer, more meaningful experience, its necessity is transcended by
the drive for immediacy and convenience.
Not every aesthetic experience in this age of tumultuous technological disruptions is
significant. Glorifying tyrants or appealing to the lowest instincts (sexism, racism, nationalism,
celebrity worship, etc.) also affects the human condition. But probably not in the sense of
elevating it. The art associated with the new forms of scientific or technological expression
produces, mostly, new profit-making machines, and the vain hope that you don’t need talent to
be an artist – the machine will do it. David Hockney’s “finger paintings” on the iPhone created
an industry, as much as the industrious pop-artists did a few years back (while painting a
swimming pool in Los Angeles).
Once again: such remarks might confuse those who know me and my work as the expression
of unrestrained optimism. Opportunities keep multiplying. Harold Cohen is an example I bring
up with pleasure because in his attempt to integrate artificial intelligence in the making of art,
he had to reflect upon the fundamental questions of “What is art?” and “Why do we make
art?” My hope is that the Why question will overtake the infantile How question that
dominates aesthetic activity in our days.
5. Media Literacy
RS: Let me come back to your book The Civilization of Illiteracy where you depict a civilization
unfolding in which media complement literacy, and literacy – the way it is conceptualized in
the Gutenberg Galaxy – is undermined by new literacies demanded and developed by digital
technology. The general tone of the book is one of excitement and the invitation to be ready
for the new challenges. Your answers in this interview so far indicate that this has changed.
MN: Fifteen years after The Civilization of Illiteracy was published (and almost 30 years since I
started writing it), I cannot be more optimistic than I was at the time it was published. I already
mentioned that I am convinced that it is the book of our time: new developments are still
catching up with some of its predictions. It is an 890-page book, which I thought would be the
last book of the civilization of literacy. If anyone cared to edit it, I would have no problem
co-signing a streamlined edition. But these remarks are not about how I feel. The book would
probably best serve the needs of current readers in an interactive multimedia form. I do not
see anything terrifying in the reality that the human condition changes. It is not a curse, but a
blessing. Corresponding to the new pragmatic framework, we are all experiencing the need to
adapt more rapidly, and sometimes to trade depth for breadth. We do not have enough
courage to discard everything that is still based on the structure of the previous pragmatic
framework. The program in which I teach just built a new arts and technology teaching and
learning facility: the same factory model; the same centralized, hierarchic structure. In reality,
such a building should not have been erected. (Never mind that the pretentious architecture
bore the stamp of someone who does not understand what education in our time should be!)
On the one hand, there are the big pronouncements regarding the state of science and
technology; on the other, captivity to the past. Conflict does not scare me. I see in conflict the
possibility of an authentic revolution in education and in many other societal activities. What
scares me is the deeply ingrained conformity to the medieval model of teaching and learning.
And the demagoguery associated with monetizing all there is. The Faustian trade-off is
skewed: I will give you the illusion of eternity in exchange for your abdicating your desire to
discover what it means to live.
RS: How do you see this Faustian trade-off (coming) in place? Are you talking about the
computational, digital turn in the Humanities?
MN: A recent book on Digital Humanities (Anne Burdick, Johanna Drucker, Peter Lunenfeld,
Todd Presner, Jeffrey Schnapp, MIT Press 2012) claims that “Digital Humanities is born of the
encounter between traditional humanities and computational methods.” Of course “recent”
does not qualify as “significant”. We learn from the text (and the comments it triggered) that
“Digital Humanities is a generative practice”, and that it “contributes to the ‘screen culture’” of
the 21st century. But we do not gain access to the questions of the human condition. We
learn about design, but not from an informed perspective of the activity, rather on account of
a reactive process of design that lacks a visionary dimension. McLuhan is quoted (again, the
echo chamber metaphor is quite well illustrated in the tone of the writing); so are John Berger,
Scott McCloud (on comics), and even Charles and Ray Eames. In respect to computation, the
discourse is even more muddled. The words are often right; missing is the deeper
understanding of the dynamics of human existence and activity. The applied aspect made the
book a good candidate for adoption – and explains why it was funded: it promotes a notion of
humanity congruent with that of technology.
In reality, “Humanities” is the expression of resistance. Those involved in humanities probe
the science and technology instead of automatically accepting them. These remarks should
not be construed as a book review. I use the book as an opportunity to recognize those
honestly interested in understanding what is happening in our days, but also to point out that
the endeavor is complicated by the fact that we are part of the process. You don’t have
insight into the earthquake that reshapes the landscape. The hype over big data is of the
same nature as the hype over the digital (sic!) humanities. Humanities – i.e., the many
disciplines that fit under this heading – is rushing into a territory of methods and perspectives
defined for purposes different from those of the humanities. To give up the long view for the
immediacy of results is not a good trade-off. I am amused by those great “humanists” who
seek out programmers for testing their own ideas. Smiling, we bid farewell to the past (some
might recognize behind this formulation an author who saw part of this coming).
RS: Let me bring in another aspect of this. Computation – or algorithmic reading – has been a
tool of research in the humanities for some time. Digital Humanities aims at the application of
digital processes and resources for text and image analysis, large data mining, and data
visualization. The rationale behind it: Machines are better in processing data than humans.
However, the reading that algorithms carry out is “distant” in contrast to the close reading by
humans. Your comment on Digital Humanities above is quite direct and critical. In the same
spirit you state in your article “Reassessing the Foundations of Semiotics”: Quantity does not
automatically lead to improved comprehension. The challenging semiotic project is, as you
continue, not only to find information in big data, but also meaning in information. What do
you expect from Digital Humanities in terms of reassessed semiotics?
MN: The great assumption is that there is a universal machine: the Turing machine. This
assumption has led to the spread of the most insidious forms of determinism. Algorithmic
computation became the magic formula for fighting disease, making art, and building rockets.
It is forgotten that Turing defined only a specific form of automated mathematics. Universities,
as centers of inquiry, were only too happy to replace the thinking of previous ages with the
inquiry associated with virtual machines. They housed the big mainframe machines.
Everything became Turing computational, and at the same time, as circular as the underlying
premise. If you can describe an activity – that is, if you have an algorithm – algorithmic
computation would perform that particular operation as many times as you wished, and in
every place where that operation is involved. As long as the focus is on algorithmic
descriptions, computation is assumed to be universal. Indeed, the arithmetic behind selling
tomatoes in a market or exploring the moon became the same.
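
[An illustrative aside, not from the interview: the deterministic model under discussion can be made concrete with a toy Turing machine, whose entire behavior is fixed in advance by a transition table and which will repeat the same operation wherever it is invoked. The binary-increment machine below is a hypothetical sketch in Python.]

# A toy Turing machine: current state + symbol deterministically fix the next move.
def run_turing(tape, rules, state="start", blank="_", max_steps=1000):
    """rules: (state, symbol) -> (next_state, symbol_to_write, head_move)."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, cells[head], move = rules[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Hypothetical example: increment a binary number. Walk to the rightmost
# bit, then carry leftward until a 0 (or the tape's edge) absorbs the carry.
increment = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),  # fell off the right end: turn around
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, keep carrying
    ("carry", "0"): ("halt", "1", 0),    # 0 + carry = 1, done
    ("carry", "_"): ("halt", "1", 0),    # carried past the left end: grow the tape
}
print(run_turing("1011", increment))  # prints "1100" (11 + 1 = 12)
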
It turns out that quite a number of problems – the most interesting ones, actually – are not
algorithmic. Protein folding, essential in living processes, is one example. So is computer
graphics, involving interactive elements. Furthermore, adaptive processes cannot be
described through algorithmic rules. More important, anticipatory processes refuse to fit into
neat algorithmic schemes. At the time when I advanced the notion that the computer is a
semiotic engine, my enthusiasm was way ahead of my ability to understand that the so-called
universal machine is actually one of many others. Today we know of DNA programming,
neural network computation, and membrane computation, some equivalent to a Turing
machine, some not.
We are not yet fully aware that the knowledge domain covered by the universal computation
model (the Turing machine) is relatively small. We are less aware of the fact that specific
forms of computation are at work in the expression of the complexity characteristic of the
living. The university is still “married” to the deterministic model of computation because
that’s where the money is. If you want to control individuals, determinism is what you want to
instill in everything: machines, people, groups. Once upon a time, the university contributed to
a good understanding of the networks. Today, it only delivers the tradespeople for all those
start-ups that shape the human condition through their disruptive technologies way more than
universities do. Working on a new foundation for semiotics, I am inclined to see semiotics as
foundational for the information age. But that is a different subject. If and when my work is
done, I would gladly continue the dialog.
