Viewing: Blog Posts Tagged with: Dennis Baron, Most Recent at Top
Results 26 - 37 of 37
26. It’s alive! New computer learns language like a human, almost.

By Dennis Baron


A computer at Carnegie Mellon University is reading the internet and learning from it in much the same way that humans learn language and acquire knowledge, by soaking it all up and figuring it out in our heads.

People’s brains work better some days than others, and eventually we will all run out of steam, but the creators of NELL, the Never Ending Language Learner, want it to run forever, getting better every day in every way, until it becomes the largest repository imaginable of all that’s e’er been thought or writ.

Since the first “electronic brains” began to appear in the late 1940s, it has been the goal of computer engineers and the occasional mad scientist to fashion machines that think and learn like people do. Or at least machines that perform functions analogous to some aspects of human thought, and which also self-correct by analyzing their mistakes and doing better next time around.

Setting out to create an infinite and immortal database is a big task: there’s a lot for NELL to learn in cyberspace, and a whole lot more that has yet to be digitized. But since NELL was activated a few months ago it has learned over 440,000 separate things with an accuracy of 74%, which, to put it in terms that any Carnegie Mellon undergraduate can understand, is a C. In contrast, I have no idea how to count what I’ve learned since my own brain went on line, and no idea how many of the things that I know are actually correct, which suggests that all I’ve got on my cerebral transcript is an Incomplete.

NELL’s programmers seeded it with some facts and relations so that it had something to start with, then set it loose on the internet to look for more. NELL sorts what it finds into categories like mountains, scientists, writers, reptiles, universities, web sites, or sports teams, and relations like “teamPlaysSport, bookWriter, companyProducesProduct.”

NELL also judges the facts it finds, promoting some of them to the higher category of “beliefs” if they come from a single trusted source, or if they come from multiple sources that are less reliable. According to the researchers, “More than half of the beliefs were promoted based on evidence from multiple [i.e., less reliable] sources,” making NELL more of a rumor mill than a trusted source. And once NELL promotes a fact to a belief, it stays a belief: “In our current implementation, once a candidate fact is promoted as a belief, it is never demoted,” a process that sounds more like religion than science.
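
The post doesn’t spell out NELL’s promotion logic in code, but the rule as described (a candidate fact becomes a belief on the strength of one trusted source or of several less reliable ones, and is never demoted afterward) can be sketched roughly as follows. The class, the threshold, and the example fact below are illustrative assumptions, not NELL’s actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class CandidateFact:
    """One extracted fact, e.g. ("ketchup", "isA", "condiment")."""
    subject: str
    relation: str
    value: str
    trusted_sources: set = field(default_factory=set)  # high-confidence extractors
    weak_sources: set = field(default_factory=set)     # less reliable extractors
    is_belief: bool = False

# Illustrative threshold: how many weak sources count as much as one trusted one.
WEAK_SOURCE_THRESHOLD = 2

def maybe_promote(fact: CandidateFact) -> None:
    """Promote a candidate fact to a belief; per the post, promotion is one-way."""
    if fact.is_belief:
        return  # "once a candidate fact is promoted as a belief, it is never demoted"
    if fact.trusted_sources or len(fact.weak_sources) >= WEAK_SOURCE_THRESHOLD:
        fact.is_belief = True

# A fact backed only by two less reliable extractors still becomes a belief,
# which is the behavior the post likens to a rumor mill.
fact = CandidateFact("ketchup", "isA", "condiment",
                     weak_sources={"pattern_extractor", "list_extractor"})
maybe_promote(fact)
print(fact.is_belief)  # True
```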

Sometimes NELL makes mistakes: the computer incorrectly labeled “right posterior” as a body part. NELL proved smart enough to call ketchup a condiment, not a vegetable, a mislabeling that we owe to the “great communicator,” Ronald Reagan. But its human handlers had to tell NELL that Klingon is not an ethnic group, despite the fact that many earthlings think it is. Alex Trebek would be happy to know that, unlike Sean Connery, NELL has no trouble classifying therapists as a “profession,” but the computer trips up on the rapists, which it thinks could possibly be “awardtrophytournament” (confidence level, 50%).

NELL knows that cookies are a “baked good,” but that caused the computer to assume that persistent cookies and internet cookies are also baked goods. But that’s not surprising, since it still hasn’t learned what metaphors are—NELL is only 87.5% confident that metaphors are “tools” (plus, according to NELL, there’s a 50-50 chance that metaphors are actually “book writers”).

Told by its programmers that Risk is a board game, NELL predicts w

27. The English Language Unity Act: Big Government Only a Tea Partier Could Love

By Dennis Baron


Tea Partiers seem intent on throwing more and more of the American government overboard. Yet there’s one area where both these wing nuts and many ordinary conservatives support more big government, not less: they want the government to make everyone in America speak English.

Christine O’Donnell, an upstart Tea Party candidate who beat a more traditional Republican congressman in Delaware’s senate primary, says that she “will fight to make English America’s official language for all governmental purposes,” adding, “We cannot be one people without speaking ONE language in common.” In 2007 O’Donnell told Fox News that American scientists deep in their underground laboratories had created mice with “fully functioning human brains”—experiments which she opposes, unless of course the researchers can guarantee that the mice will speak English.

Tea Partiers support official English because they believe that not speaking English is prima facie evidence that you’re an illegal immigrant who swam across the Rio Grande. They’ve forgotten that English itself is an immigrant language, not just in the U. S. of A., where it clambered ashore “without papers” along with the pilgrims and the Virginia colonists in the early 1600s, but also in England, where, though it wasn’t called English, ‘the language of the Angles,’ till it got to Britain, it swam the North Sea with marauding Angles and Saxons in the 5th century, CE. Everyone in the Tea Party seems also to forget that these English-speaking illegals eventually turned merry old England into a socialist state with government-run health care.

Despite the fact that English has now gone global and is spoken by more people across the planet than any world language ever before, some Americans fear that English is endangered at home. That’s why Iowa’s Rep. Steve King, a career politician who wants government out of our lives (except for banning abortions and term limits), but firmly believes that government must dictate our language preference, bullied Iowans into passing an official English law in 2002. King scared Iowans with Census figures showing that the state’s Hispanic population had doubled since 1990, going from 1% to 2%, ignoring the fact that more than half of the state’s 80,000 Spanish speakers also spoke English well or very well and the rest were learning it as fast as they could. King then successfully sued his own state for posting election information on its website in languages other than English in violation of the new official English law, further demonstrating his belief that it is the job of government to meddle with the lives of citizens and limit their individual choices.

King then introduced the English Language Unity Act of 2009 in the House of Representatives in order to make English the official language of the United States. H.R. 997 requires English for all official government actions, everything from our laws, which are already in English, to anything that the government does that is “subject to scrutiny by either the press or the public,” which seems to cover everything from committee reports, hearings, and press briefings subject to the Freedom of Information Act, to the sexual peccadilloes and personal indiscretions that some members of Congress keep inadvertently exposing to the scru

28. The Gender-Neutral Pronoun: 150 Years Later, Still an Epic Fail

By Dennis Baron


Every once in a while some concerned citizen decides to do something about the fact that English has no gender-neutral pronoun. They either call for such a pronoun to be invented, or they invent one and champion its adoption. Wordsmiths have been coining gender-neutral pronouns for a century and a half, all to no avail. Coiners of these new words insist that the gender-neutral pronoun is indispensable, but users of English stalwartly reject, ridicule, or just ignore their proposals.

Recently, Guardian columnist Lucy Mangan called for a gender-neutral pronoun:

The whole pronouns-must-agree-with-antecedents thing causes me utter agony. Do you know how many paragraphs I’ve had to tear down and rebuild because you can’t say, “Somebody left their cheese in the fridge”, so you say, “Somebody left his/her cheese in the fridge”, but then you need to refer to his/her cheese several times thereafter and your writing ends up looking like an explosion in a pedants’ factory? … I crave a non-risible gender-neutral (not “it”) third person sing pronoun in the way normal women my age crave babies. (The Guardian, July 24, 2010, p. 70)

English is a language with a vocabulary so large that every word in it seems to have a dozen synonyms, and yet this particular semantic black hole remains unfilled. As Tom Utley complains in the Daily Mail,

It never ceases to infuriate me, for example, that in this cornucopia of a million words, there’s no simple, gender-neutral pronoun standing for ‘he-or-she’.

That means we either have to word our way round the problem by using plurals – which don’t mean quite the same thing – or we’re reduced to the verbose and clunking construction: ‘If an MP steals taxpayers’ money, he or she should be ashamed of himself or herself.’ (‘Themselves’, employed to stand for a singular MP, would, of course, be a grammatical abomination). (London Daily Mail, June 13, 2009)

The traditional gender agreement rule states that pronouns must agree with the nouns they stand for both in gender and in number. A corollary requires the masculine pronoun when referring to groups composed of men and women. But critics argue that such generic masculines – for example, “Everyone loves his mother” – actually violate the gender agreement part of the pronoun agreement rule. And they warn that the common practice of using they to avoid generic he violates number agreement: in “Everyone loves their mother,” everyone is singular and their is plural. Only a new pronoun, something like ip, coined in 1884, can save us from the error of the generic masculine or the even worse error of singular “they.”

Such forms as co, xie, per, and en abound in science fiction, where gender is frequently bent, and they pop up with some regularity in online transgender discussion groups, where the traditional masculine and feminine pronouns are out of place. But today’s word coiners seem unaware that gender-neutral English pronouns have been popping up, then disappearing without much trace, since the mid-nineteenth century.

According to an 1884 article in the New-York Commercial Advertiser

29. Good grammar leads to violence at Starbucks?

By Dennis Baron


Apparently an English professor was ejected from a Starbucks on Manhattan’s Upper West Side for – she claims – not deploying Starbucks’ mandatory corporate-speak. The story immediately lit up the internet, turning her into an instant celebrity. Just as Steven Slater, the JetBlue flight attendant who couldn’t take it anymore, became the heroic employee who finally bucked the system when he cursed out nasty passengers over the intercom and deployed the emergency slide to make his escape, Lynne Rosenthal was the customer who cared so much about good English that she finally stood up to the coffee giant and got run off the premises by New York’s finest for her troubles. Well, at least that’s what she says happened.

According to the New York Post, Rosenthal, who teaches at Mercy College and has an English Ph.D. from Columbia, ordered a multigrain bagel at Starbucks but “became enraged when the barista at the franchise” asked, “Do you want butter or cheese?” She continued, “I refused to say ‘without butter or cheese.’ When you go to Burger King, you don’t have to list the six things you don’t want. Linguistically, it’s stupid, and I’m a stickler for correct English.” When she refused to answer, she claims that she was told, “You’re not going to get anything unless you say butter or cheese!” And then the cops came.

Stickler for good English she may be, but management countered that the customer then made a scene and hurled obscenities at the barista, and according to the Post, police who were called to the scene insist that no one was ejected from the coffee shop.

I too am a professor of English, and I too hate the corporate speak of “tall, grande, venti” that has invaded our discourse. But highly-paid consultants, not minimum-wage coffee slingers, created those terms (you won’t find a grande or a venti in Italian coffee bars). Consultants also told “Starbuck’s” to omit the apostrophe from its corporate name and to call its workers baristas, not coffee-jerks.

My son was a barista (should that be baristo?) at Borders (also no apostrophe, though McDonald’s keeps the symbol, mostly) one summer, and many of my students work in restaurants, bars, and chain retail stores. The language that employees of the big chains use on the job is carefully scripted and choreographed by market researchers, who insist that employees speak certain words and phrases, while others are forbidden, because they think that’s what moves “product.” Scripts even tell workers how and where and when to move and what expression to paste on their faces. Employees who go off-script and use their own words risk demerits, or worse, if they’re caught by managers, grouchy customers, or the ubiquitous secret shoppers who ride the franchise circuit looking for infractions.

I’m no fan of this corporate scripting. Calling customers “guests” and employees “associates” doesn’t mean I can treat Target like a friend’s living room or that the clerks who work there are anything but low-level employees who associate with one another, not with corporate vice presidents. I don’t think this kind of language-enforcement increases sales or

30. Robot Teachers! (Coming Soon to a Classroom Near You)

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he discusses the increasingly popular trend of using robots in classrooms. Read his previous posts here.

They’re coming, and they’ll be here by September! Robot teachers, programmed with a single mission: to save our failing schools.

Funded by the Frankenstein Foundation, computer engineers in secret mountain laboratories and workshops hidden deep below the desert floor are feverishly soldering chips and circuit boards onto bits of aluminum to create mechanical life forms whose sole purpose is to teach English.

We need this invasion of English-teaching robots because, according to researchers at the University of California, San Diego, “an unprecedented number of children in the US start public school with major deficits in basic academic skills, including vocabulary skills.” So computer scientists at UCSD’s Machine Perception Laboratory designed RUBI, a “sociable robot” who successfully taught a group of toddlers ten vocabulary words in only twelve days. RUBI improved the children’s word-mastery by a full 25% compared to a control set of words not taught by the mechanical wonder.

In another experiment, another RUBI the Robot successfully taught English-speaking preschoolers nine words in Finnish (Finnish is notoriously difficult for English speakers to learn because it is unrelated to English and the other major European languages). And Korean educators report similar success with ENGKEY, yet another robot English teacher. English is mandatory in all Korean schools, and the government hopes to replace expensive and hard-to-recruit native speakers of American or Canadian English with expensive and hard-to-maintain machines like ENGKEY, a robot programmed to recognize and respond to human speech.

Like computers, robots appeal to school administrators who think the machines are smarter, cheaper, more efficient, and less likely to talk back or take sick days than human teachers. And they appeal as well to legislators, government officials, and employers who are concerned with low test scores, high drop-out rates, and global economic competition.

To put the robot teacher invasion into context, we should remember that using technology to teach is hardly a new idea. Books are a teaching technology, though anyone who has studied a foreign language only from a textbook and then tried watching a foreign-language film knows that even tried-and-true book-learning has its limitations.

As for the newer communication technologies, when they came on the scene, telephones, radio, film, and television were all going to deliver information to students faster and more efficiently than any teacher could. What I learned from educational radio in the fourth grade was how to sleep in class with my eyes open, and what I learned from filmstrips in school was how long it took for the heat of

31. Revising Our Freedom

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he explains how digital archaeology has revealed Thomas Jefferson’s revisions of the Declaration of Independence. Read his previous posts here.

Jefferson on the two-dollar bill

In a list of the colonies’ grievances against King George III Jefferson wrote, “he has incited treasonable insurrections of our fellow-subjects, with the allurements of forfeiture and confiscation of our property.” But the future president, whose image now graces the two-dollar bill, must have realized right away that “fellow-subjects” was the language of monarchy, not democracy, because “while the ink was still wet” Jefferson took out “subjects” and put in “citizens.”

In a eureka moment, a document expert at the Library of Congress examining the rough draft late at night suddenly noticed that there seemed to be something written under the word citizens. It was no Da Vinci code or treasure map, but Jefferson’s original wording, soon uncovered using a technique called “hyperspectral imaging,” a kind of digital archeology that lets us view the different layers of a text. The rough draft of the Declaration was digitally photographed using different wavelengths of the visible and invisible spectrum. Comparing and blending the different images revealed the word that Jefferson wrote, then rubbed out and wrote over.
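
As a rough illustration of that comparing-and-blending step, here is a minimal sketch assuming two registered grayscale photographs of the same spot on the page, one taken in visible light and one at another wavelength. The function name and the simple difference blend are illustrative assumptions; this is not the Library of Congress’s actual hyperspectral pipeline.

```python
import numpy as np

def reveal_underwriting(visible_band: np.ndarray, other_band: np.ndarray) -> np.ndarray:
    """Blend two spectral images of the same page so erased ink stands out.

    Ink that was rubbed out can be invisible in ordinary light yet still
    absorb at another wavelength, so subtracting the normalized bands
    highlights exactly the pixels where the two photographs disagree.
    """
    def normalize(band: np.ndarray) -> np.ndarray:
        band = band.astype(float)
        return (band - band.min()) / (band.max() - band.min() + 1e-9)

    return normalize(visible_band) - normalize(other_band)

# Toy example: one pixel looks blank (bright) in visible light but dark in the
# other band; that pixel dominates the difference image.
visible = np.array([[250, 250], [250, 40]])
other = np.array([[250, 30], [250, 40]])
print(reveal_underwriting(visible, other))
```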

Excerpt from the rough draft of the Declaration showing the sentence that Jefferson edited

Detail from the above

Above: Jefferson’s rough draft reads, “he has incited treasonable insurrections of our fellow-citizens”; followed by a detail of “fellow-citizens” with underwriting visible in ordinary light. Below: a series of hyperspectral images made by the Library of Congress showing that Jefferson’s initial impulse was to write “fellow-subjects.” [Hi-res images of the rough draft are available at the Library of Congress website.] Elsewhere in the draft Jefferson doesn’t hesitate to cross out and squeeze words and even whole lines in as necessary, but in this case he manages to fit his emendation neatly into the same space as the word it replaces.

32. The Book, the Scroll, and the Web

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he looks at the difference between scrolls and codices.

The scroll, whose pages are joined end-to-end in a long roll, is older than the codex, a writing technology — known more familiarly as the book — with pages bound together at one end. Websites have always looked more like scrolls than books, a nice retro touch for the ultra-modern digital word, but as e-readers grow in popularity, texts are once again looking more like books than scrolls. While the first online books, the kind digitized by Project Gutenberg in the 1980s, consisted of one long, scrolling file, today’s electronic book takes as its model the conventional printed book that it hopes one day to replace.

Fans of the codex insist that it’s an information delivery system superior in every way to the scroll, and whether or not they approve of ebooks, they think that all books should take the form of codices. For one thing, book pages can have writing on both sides, making them more economical than scrolls, which are typically written on one side only (this particular codex advantage turns out to be irrelevant for ebooks). For another, the codex format makes it easier to compare text on different pages, or in different books, which some scholars think fosters objective, critical, or scientific thinking. It’s also easier to locate a particular section of a codex than to roll and unroll a scroll looking for something. These may or may not be advantages for books over scrolls, but it’s not a problem online, where keyword searching makes it easy to find digitized text in a nanosecond, regardless of its format, plus it’s possible to compare any online texts or the parts thereof simply by opening each in a different window and clicking from one to another. In the world of the ebook, codex or scroll becomes a preference, not an advantage.

A few tunnel-visioned readers associate the codex with Christianity, viewing scrolls as relics of heathen religion. Not to be outdone, some people see online books as messianic, and others think they represent the ultimate heresy — but religion aside, there’s no particular advantage for page over scroll in either the analog or the digital world. Finally, although this example of codex superiority is seldom mentioned, the codex can be turned into a flip book by drawing cartoons on the pages and then fanning them so the images appear to move. But then again, a motion picture is really a scroll full of pix unwinding at 24 frames per second. None of this makes a difference if your ebook, iPad, or smartphone won’t play Flash video.

There is one advantage of the book over the scroll that may apply to the computer. According to psychologists Christopher A. Sanchez and Jennifer Wiley, poor readers have more trouble understanding scrolled text on a computer than digital text presented in a format resembling the traditional printed page. But these researchers found that better readers, those with stronger working memories, understand scrolls and pages equally well.

While Sanchez and Wiley’s experiments suggest that for some readers, paging is better for comprehension than scrolling, their results are o

33. Wikipedia: Write First, Ask Questions Later

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he looks at Wikipedia.

Admit it, we all use Wikipedia. The collaborative online encyclopedia is often the first place we go when we want to know a fact, a date, a name, an event. We don’t even have to seek out Wikipedia: in many cases it’s the top site returned when we google that fact, date, name, or event. But as much as we’ve come to rely on it, Wikipedia is also the online source whose reliability we most often question or ridicule.

Wikipedia is the ultimate populist text, a massive database of more than 3.2 million English-language entries and 6-million-plus entries in other languages. Anyone can write a Wikipedia article, no experience necessary. Neither is knowing anything about the subject, since Wikipedians — you can be one too — can simply copy information from somewhere else on the internet and post it to Wikipedia. It doesn’t matter if the uploaded material is wrong: that can be fixed some other time. Wikipedia’s philosophy comes right out of the electronic frontier’s rough justice: write first, ask questions later.

When it comes to asking those questions, doing the fact-checking, Wikipedia depends on the kindness of strangers. Once an article on any topic is uploaded, anyone can read it, and any Wikipedian can revise or edit it. And then another Wikipedian can come along and revise or edit that revision, ad infinitum. Of course not every error is apparent, and not every Wikipedian will bother to correct an error even if they notice one. Wikipedians can even delete entries, if they find fault with them, but then other Wikipedians can decide to reinstate them.
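
As a rough picture of that write-first, edit-forever cycle, here is a minimal sketch of an article modeled as a revision log, where revising, deleting, and reinstating are all just further appended revisions. This is an illustrative toy under those assumptions, not MediaWiki’s actual data model or API:

```python
from datetime import datetime, timezone

class WikiArticle:
    """Toy model of the edit cycle: every change is just another appended revision."""
    def __init__(self, title: str, text: str, author: str):
        self.title = title
        self.revisions = []  # the full history is kept; nothing is ever lost
        self.edit(text, author)

    def edit(self, text: str, author: str) -> None:
        """Anyone can revise the current text, including revising an earlier revision."""
        self.revisions.append({
            "text": text,
            "author": author,
            "time": datetime.now(timezone.utc),
        })

    def delete(self, author: str) -> None:
        """'Deletion' is just another revision with empty text..."""
        self.edit("", author)

    def reinstate(self, revision_index: int, author: str) -> None:
        """...so another editor can restore any earlier version."""
        self.edit(self.revisions[revision_index]["text"], author)

    @property
    def current_text(self) -> str:
        return self.revisions[-1]["text"]

# Write first, ask questions later: the first version may be wrong,
# and later editors fix it (or break it again).
article = WikiArticle("NELL", "NELL is a robot teacher.", "anonymous")
article.edit("NELL is a machine-learning system at Carnegie Mellon.", "editor_2")
article.delete("vandal")
article.reinstate(1, "editor_3")
print(article.current_text)
```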

Such sketchy reliability is why many teachers warn students not to use Wikipedia in their research. This despite the fact that a 2005 Nature study showed that, so far as a selection of biology articles was concerned, Wikipedia’s reliability was on a par with that of the Encyclopaedia Britannica. But teachers don’t want their students using the Britannica either. Wikipedia actually offers an article about its own reliability, though the accuracy of that article remains to be determined.

A study by researchers at the University of Washington finds that most students use Wikipedia, even though their instructors warn them not to. Not only that, but students in architecture, science, and engineering are the most likely to use Wikipedia. Apparently those students, whose disciplines depend on accurate measurements and verifiable evidence, don’t expect accuracy from their works cited. According to the study, students in the social sciences and humanities, subjects emphasizing argumentation and critical reading, are less-frequent users of Wikipedia. Unfortunately, the Washington researchers didn’t ask these students how much they rely on Spark Notes.

It turns out that students also distrust Wikipedia, though not with the same intensity their teachers do. Only 16% of students find Wikipedia articles credible. To

34. Should Everybody Write? Or is There Enough Junk on the Internet Already?

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he looks at writing on the internet.

“Should everybody write?” That’s the question to ask when looking at the cyberjunk permeating the World Wide Web.

The earlier technologies of the pen, the printing press, and the typewriter all expanded the authors club, whose members create text rather than just copying it. The computer has expanded opportunities for writers too, only faster, and in greater numbers. More writers means more ideas, more to read. What could be more democratic? More energizing and liberating?

But some critics find the glut of internet prose obnoxious, scary, even dangerous. They see too many people, with too little talent, writing about too many things.

Throughout the 5,000-year history of writing, the privilege of authorship was limited to the few: the best, the brightest, the luckiest, those with the right connections. But now, thanks to the computer and the internet, anyone can be a writer: all you need is a laptop, a Wi-Fi card, and a place to sit at Starbucks.

The internet allows writers to bypass the usual quality controls set by reviewers, editors and publishers. Today’s authors don’t even need a diploma from the Famous Writers School. And they don’t need to wait for motivation. Instead of staring helplessly at a blank piece of paper the way writers used to, all they need is a keyboard and right away, they’ve got something to say.

You may not like all that writing, but somebody does. Because the other thing the internet gives writers is readers, whether it’s a nanoaudience of friends and family or a virally large set of FBFs, Tweeters, and subscribers to the blog feed. Apparently there are people online willing to read anything.

Previous writing technologies came in for much the same criticism as the internet: too many writers, too many bad ideas. Gutenberg began printing bibles in the 1450s, and by 1520 Martin Luther was objecting to the proliferation of books. Luther argued that readers need one good book to read repeatedly, not a lot of bad books to fill their heads with error. Each innovation in communication technology brought a new complaint. Henry David Thoreau, never at a loss for words, wrote that the telegraph – the 19th century’s internet – connected people who had nothing to say to one another. And Thomas Carlyle, a prolific writer himself, insisted that the explosion of reading matter made possible by the invention of the steam press in 1810 led to a sharp decline in the quality of what there was to read.

One way to keep good citizens and the faithful free from error and heresy is to limit who can write and what they can say. The road to publication has never been simple and direct. In 1501, Pope Alexander VI’s bull Inter multiplices required all printed works to be approved by a censor. During the English Renaissance, when literature flourished and even kings and queens wrote poetry, Shakespeare couldn’t put on a play without first getting a license. Censors were a kind of low-tech firewall, but just as there have always been censors, there have always been writers evading them and readers willing, or even anxious, to devour anything on the do-not-read list.

Today crit

35. Sliced Bread 2.0

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he looks at the success of the internet.

You’ve heard the Luddite gripes about the digital age: computers dehumanize us; text messages are destroying the language; Facebook replaces real friends with imaginary ones; instant messages and blogs give a voice to people who have nothing to say. But now a new set of complaints is emerging, this time from computer scientists, internet pioneers who once promised that the digital revolution was the best thing since sliced bread, no, that it was even better, Sliced Bread 2.0.

It started in the mid-1990s with Clifford Stoll. You may remember Stoll as the Berkeley programmer who tracked down a ring of West German hackers who were breaking into secure military computers, and wrote up the adventure in the 1990 best-seller, The Cuckoo’s Egg. But a mere five years later Stoll published Silicon Snake Oil, a condemnation of the internet as oversold and underperforming. In a 1995 Newsweek op-ed, Stoll summed up the internet’s failed promise of happy telecommuters, online libraries, media-rich classrooms, virtual communities, and democratic governments in one word: “Baloney.”

More nuanced is the critique of Jaron Lanier, the programmer who brought us virtual reality, but who now labels life online “digital maoism.” In a recent interview in the Guardian, Lanier charged that after thirty years the great promise of a free and open internet has brought us not burgeoning communities of online musicians, artists, and writers, but “mediocre mush”; a pack mentality; recreations of things that were better done with older technologies; an occasional Unix upgrade; and an online encyclopedia. His conclusion: it’s all “pretty boring.”

And although internet guru Jonathan Zittrain praises the first personal computers and the early days of the internet for promoting unlimited creativity and exploration, he warns that the generative systems which enabled users to create new ways of being and communicating are giving way to tethered devices like smart phones, Kindles, TiVos, and iPads, all of which channel our communications and program our entertainment along safe and familiar paths and prohibit inventive tinkering. Zittrain reminds us that the PC was a blank slate, a true tabula rasa that let imaginative, technically accomplished users repurpose it over and over again, but he fears that the internet appliance of the future will be little more than a hi-tech toaster programmed to let us do only what the marketing departments at Apple, Microsoft, Google, and Amazon want us to do.

It’s easy to ignore the Luddites. The internet isn’t destroying English (you’re reading this online, right?) or replacing face-to-face human interaction (Facebook or no Facebook, babies continue to be born). Plus, we’re all using computers and the ‘net, so how bad can they be?

But what about the informed critiques of experts like Stoll, Jaro

36. Multitasking: Learning to Teach and Text at the Same Time

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he looks at multitasking in a digital world.

Most of my students belong to the digital generation, so they consider themselves proficient multitaskers. They take notes in class, participate in discussion, text on their cell phones, and surf on their laptops, not sequentially but all at once. True, they’re not listening to their iPods in class, and they may find that inconvenient, since they like a soundtrack accompanying them as they go through life. But they’re taking advantage of every other technology they can cram into their backpacks. They claim it helps them learn, even if their parents and teachers are not convinced.

Too old to multitask? The author texting while writing on a laptop and listening to tunes.

Recently one of my students, a college senior, added to this panoply of technology an older form of classroom inattention: while I explored the niceties of English grammar, he was doing homework for another class. When I asked him to put away the homework and pay attention, he replied that he was paying attention, just multitasking to maximize efficiency. “I can multitask too,” I said, taking out my cell phone and starting to text as I went on with the lesson.

My students didn’t like this. They expected their teacher’s full attention, even if they weren’t going to give me theirs. Plus, they argued, “When you text, you have to stop talking so you can look at the keyboard. That’s not multitasking.” I was using a computer before most of them were born, but they were right, I can’t talk and text. Their pitying expressions said it all: too old to multitask. But what really got them was the thought that I might actually want to multitask, that I might be able to sneak in another activity while I was teaching them.

Although it’s gotten a lot of attention in the digital age, multitasking isn’t new, nor is it the sole property of the young. We commonly do two things at once — singing while playing an instrument, driving while talking to a passenger, surfing the web while watching TV. Despite the fact that a growing body of research suggests that multitasking decreases the efficiency with which we perform simultaneous activities, the idiom he can’t walk and chew gum at the same time shows that we expect a certain amount of multitasking to be normal, if not mandatory.

As for predigital, adult multitasking, office workers have been typing, answering phones, and listening to music, since, like, forever, without any loss of efficiency, except of course when Richard Nixon’s secretary, Rose Mary Woods, blamed the 18½-minute gap on one of the Watergate tapes on a multitasking mishap. Woods was listening to the tape and transcribing it when the phone rang. As she leaned

37. Technology Reduces the Value of Old People, Warns MIT Computer Guru

Dennis Baron is Professor of English and Linguistics at the University of Illinois. His book, A Better Pencil: Readers, Writers, and the Digital Revolution, looks at the evolution of communication technology, from pencils to pixels. In this post, also posted on Baron’s personal blog The Web of Language, he looks at the dilemma of being old in the internet age.

Philip Greenspun, an MIT software engineer and hi-tech guru, argues in a recent blog post that “technology reduces the value of old people.” It’s not that old people don’t do technology. On the contrary, many of them are heavy users of computers and cell phones. It’s that the young won’t bother tapping the knowledge of their elders because they can get so much more, so much faster, from Wikipedia and Google.

It was adults, not the young, who invented computers, programmed them, and created the internet. OK, maybe not old adults, in some cases maybe not even old-enough-to-buy-beer adults, but adults nonetheless. Plus, the over-35 set is Facebook’s fastest growing demographic.

Even so, despite starting the computer revolution, and despite their presence on the World Wide Web today, the old are fast becoming irrelevant. According to Greenspun, “An old person will know more than a young person, but can any person, young or old, know as much as Google and Wikipedia? Why would a young person ask an elder the answer to a fact question that can be solved authoritatively in 10 seconds with a Web search?”

Why indeed? With knowledge located deep in Google’s server farms instead of in the collective memories of senior citizens, the old today are fast becoming useless. Might as well put them out on the ice floe and let them float off to whatever comes next.

According to the federal government, which is never wrong about these things, I myself became officially old, and therefore useless as a repository of wisdom and memory, last spring. But I’m not worried about being put out to sea on an ice floe, because thanks to global warming, the ice is melting so fast that it poses no danger. There’s not even enough ice out there to sink another Titanic, though if someone built a new Titanic people wouldn’t sail on it because it probably wouldn’t have free wi-fi.

I found out all I know about global warming and the shrinking ice caps and even the Titanic not from that well-known American elder, Al Gore, but from Wikipedia. Wikipedia also told me that Al Gore, who is no spring chicken, invented the internet. I learned from Google that there was no free wi-fi before the internet, and no such thing as a free lunch.

Socrates once warned that our increased reliance on writing would weaken human memory — everything we’d need to remember would be stored in documents, not brain cells, so instead of remembering stuff, we could just look it up. Socrates knew all about brain cells, of course, because he looked that up in a Greek encyclopedia (he didn’t use the Encyclopaedia Britannica, because he couldn’t read English). And just as he predicted, Socrates, who was no spring chicken, had to look up brain cells again a week later, because he forgot what it said.

2,400 years have passed since Socrates drank hemlock — that was his fellow Athenians’ way of putting an irrelevant old man out to sea — but it looks like our current dependence on computers is rendering old people’s memories irrelevant once again. And that’s probably a good thing, because as Socrates learned the hard way, old people’s memories are notoriously unreliable, which is why Al Gore, who foresaw that this would happen, also invented sticky notes.

David’s “The Death of Socrates.” We remember the Greek philosopher’s critique of writing because his student Plato wrote it down on sticky notes.

Like old people, old elephants are also no longer necessary. Elephants became an endangered species not because hunters killed them for the ivory in their tusks but because now that we have computers, no one cared that an elephant never forgets. Technology reduced the value of elephants, and so the elephants just wandered off to the elephants’ burial ground to wait for whatever comes next. And also because the elephants’ burial ground has free wi-fi.

Unlike elephants and people, computers never forget, so we can rest assured that the value of computers will never be reduced. Unlike fallible life-form-based memory banks, computers preserve their information forever, regardless of disk crashes, magnetic fields, coffee spills on keyboards, or inept users who accidentally erase an important file.

And there’s no need to throw out your 5.25″ floppies, laser disks, minidisks, Betamax, 8-track, flash drives, or DVDs just because some new digital medium becomes popular, because unlike writing on clay, stone, silk, papyrus, vellum, parchment, newsprint, or 100% rag bond paper, all computerized information is always forward-compatible with any upgrades or innovations that come along.

Plus all the information stored in computer clouds is totally reliable and always available, except of course for those pesky T-Mobile Sidekick phones whose data somehow disappeared. Assuming the cable’s not down, Google invariably shows us exactly what we’re looking for, or something that’s at least close enough to it, and Wikipedia is never wrong, ever. That’s because the information on Google and Wikipedia is put there by robots, or maybe intelligent life forms from outer space, not by people of a certain age who have to write stuff down on stickies, just as Socrates did, so they don’t forget it.

And now that I don’t have to remember all that lore that elders were once responsible for, my brain cells have been freed up to do other important stuff, like spending lots more time online looking for the meaning of life and what comes next, assuming there’s free wi-fi at the coffee shop.
