Friday, October 26, 2012

The Right Tool for the Job

All of my computers have become vastly overqualified.  I have a $1,500 MacBook Air, but I’m using $250 of it. Enter the new Chromebook.

Around 11 years ago, I decided to make a conscious effort to move all my data to servers accessible over the Internet (the term “Cloud” was not in vogue at the time) so that I could access it from any of my computers.  The process was difficult at first, since Internet connectivity was not as ubiquitous as it is today, and there were nowhere near as many high-quality cloud services available as there are now.  However, by starting with pioneering companies such as NetLedger (now NetSuite) for accounting data and Writely (now part of Google Drive) for web-based word processing, and gradually incorporating other services such as Google Apps, Dropbox, Amazon Cloud Services, Sliderocket, Gliffy, Evernote, Mint and many others, I completed my migration to the cloud roughly 3 years ago.  The only application I need to run on any of my computers is the browser, and all of my data (spreadsheets, word processing documents, PDFs, photos, videos, music, movies, TV shows, presentations, diagrams, etc.) is available from any Internet-connected computer.

Using my gorgeous 13” MacBook Air for my general computing chores became like using a Lamborghini to go to the corner grocery store: total overkill.  But that’s not exactly the right analogy, since there’s nothing wrong with going to the grocery store in style.  It felt more like I was not sufficiently leveraging the effort I had made to migrate to web-based services.  Thanks to the cloud, I no longer needed an expensive computer that did so much more than what I required.  It just felt wrong to the part of me that takes pride in using the proper tool for the job at hand.

That same part of me has been drawn toward Chrome OS since it was announced in July of 2009.  Yet there was no way I could justify spending $500 on a Chromebook when the computers I already owned were fulfilling my needs, and then some.  However, with the introduction of the $249 Samsung Chromebook, resistance became futile.  I ordered it as soon as it was announced, received it 2 days later and have been delightfully using it ever since (3 days so far).

Before I go any further I must admit that, in addition to the aforementioned MacBook Air and my new Chromebook, I own a Mac mini, a MacBook Pro, and an iMac (not to mention an iPhone 5, an iPad and a Nexus 7 tablet).  And I love them all.  But for the past few days, the device I’ve reached for in almost every computing situation has been the plastic, cheaply built Chromebook that was most obviously not designed by Jony Ive.  Why?  Because it’s the right tool for the job.  It is only able to do precisely what I need it to do: provide a gateway to the web.  No more, no less.  And in doing only that, as a by-product of its efficiency, it eliminates virtually all of the hassles inherent in other operating systems and native apps, like the constant updating, long boot times, exposure to viruses and the need to back up, among others.

Will our grandchildren’s computing devices be the progeny of today’s iOS, Android, Windows, OS X or Linux devices, all with their native apps?  Or will they be descendants of the incredible $249 computer I’m writing this on?  I’m so in love with the flat-out efficiency of this machine, which looks like it was purchased at Toys ‘R Us but makes cheap components shine more than they have a right to, that I’m betting on Chrome OS, Web Apps and Chromebooks as the eventual winners.  To me, nothing makes more sense than having my data in both a place and a format that allows me to grab any computer and just point the browser at it to use it.  Apple and Microsoft will [eventually] lose, since they are betting their companies on expensive machines and native apps.  Google and Amazon will [eventually] win, since although they are hedging their bets, they are doubling down on the web.  You heard it here first.


Monday, September 17, 2012

What the f...?

I recently received one of those widely distributed emails containing PowerPoint presentations with all sorts of ostensibly little-known fun facts, the sort of thing that tries to make you feel that you are learning something by reading it and therefore not thoroughly wasting your time.  And, on the surface, the “facts” are sort of fun: all polar bears are left-pawed, pigs have 30-minute orgasms, and so on.  But the “fact” that caught my eye was this one: the word “fuck” originated from an acronym.

The slide states, as fact, that in ancient England no one could have sex without the express permission of the king.  Therefore, couples having sex would hang a placard on their door on which was printed “F.U.C.K. - Fornication Under Consent of the King”.  Hence, that’s where the word “fuck” came from.  Isn’t it great to know this cool, fun fact?

The problem is that as cool and fun as the explanation may be, it is just plain wrong.  And it only takes a couple of clicks to debunk it.  With some help from snopes.com (articles making basically the same point may be found here and here), let me count the ways:

1.  Acronyms did not become commonly used words until the 20th century.

2.  Just a moment’s thought reveals the absurdity of the notion that people would need their king’s consent to have sex.  For starters, imagine the logistics involved, the time the king would need to invest in granting such consents, and the impossibility of enforcement.

3. The word “fornication” has, since its origins, referred specifically to sexual activity outside of marriage.  Therefore, the supposed placard would be inaccurate when applied to married couples wishing to have sex.

The truth of the matter is that, unexcitingly perhaps, the word “fuck”, like most other words, crept into the English language from other tongues, most likely Germanic languages.  The Random House Historical Dictionary of American Slang cites the Middle Dutch “fokken” (to thrust, copulate with), the Norwegian dialect “fukka” (to copulate) or the Swedish dialect “focka” (to strike, push, copulate) or “fock” (penis).

Even today, when so much of the world’s knowledge is instantly available to most of us, we tend to take things at face value, particularly when they are presented to us in writing.  Canards are circulated and perpetuated, and misinformation runs rampant.  While universally available instant communication has accelerated the proliferation of disinformation, universally available information has not had the opposite effect.

And, by the way, although pigs’ orgasms do last up to 15 minutes, polar bears are not left-pawed.


Monday, September 3, 2012

Where have you gone, Neil Armstrong?

On Saturday, August 25th, Neil Armstrong died at the age of 82.

As I'm sure is the case for anyone over 50, one of my most powerful memories ever is watching a grainy, black-and-white image where I could barely make out a man in a huge space suit stepping off a ladder, and struggling to understand that what I was watching was taking place 240,000 miles away, on the surface of the moon.  Although the 8-year-old version of myself did, of course, realize that the event was historic, I could not imagine that, 43 years later, it would still stand as (arguably) mankind’s most significant accomplishment.

It has been reported that Armstrong’s serene personality and low-profile demeanor were among the reasons NASA selected him over his crewmate Edwin "Buzz" Aldrin to be the first man to step off the Lunar Module on that fateful July 20th.  If this was indeed the case, history bears NASA's decision out as amazingly prescient, for Armstrong, who thus became the public face of the most important feat ever performed by human beings, handled his post-mission role with elegance, aplomb and self-deprecation.  Always deflecting personal credit and instead emphasizing the team nature of the undertaking, Armstrong unassumingly faded into the sunset, never flying into space again and leaving NASA in 1971 for a teaching position in the Department of Aerospace Engineering at the University of Cincinnati.  Even his choice of university exemplified his aversion to pretension, since he felt that the faculty at the small aerospace department at UC would not be annoyed at his coming straight into a full professorship with only a master’s degree, as they might have been, for example, at the larger department at his alma mater Purdue.

In my mind, Neil Armstrong is as much a hero for the way in which he handled himself after his moon landing as for his role in the landing itself.  In the words of Charles F. Bolden Jr., the current NASA administrator, Armstrong “carried himself with a grace and humility that was an example to us all.”  Armstrong precisely embodied the prototypical test pilot / astronaut mentality, so brilliantly exemplified in movies such as "The Right Stuff" and "Apollo 13", where the emphasis is on quietly getting things done, and letting others do the celebrating.  Doing their job, not for the money, not for the fame, but because it's a job that needs doing.   The satisfaction of a job well done is reward enough for these men and women.

Neil Armstrong's legacy transcends his accomplishments and instead encompasses the idea that humility enhances heroism, while self-aggrandizement diminishes it.  In the face of today’s choreographed touchdown dances, appalling self-promotion and shameless exhibitionism, Armstrong’s quiet, dignified modesty should be held up as the way accomplishments ought to be celebrated.  The first man to walk on the moon “...always believed he was just doing his job”.  All of us, from multi-million dollar athletes on down, can certainly benefit from that example.

One final tidbit from the Google Search blog that should leave us all awash in humility: NASA used about as much computing power for the entire Apollo program, in flight and on the ground, as is used for a single Google search today.

What have you and I done with the computing power at our disposal? 


Saturday, August 25, 2012

Partially Inverted Sequence



I’ve always thought that the evolution of communications over distance went something like this: smoke signals, carrier pigeon, semaphore, telegraph, landline telephone, fax, cell phone, and then today’s digital communications (text, images, audio, video) over the Internet.  I was recently surprised to learn, however, that the sequence above is partially inverted!

The fax technology with which we are all familiar involves the transmission of scanned printed material (text and/or images) over telephone lines.  This particular technology was introduced by the Xerox Corporation in the 1960’s, obviously much later than the invention of the telephone in the 1870’s.  So we assume that the telephone came first, and then, much later, the fax.  However, that is not the case at all!  In fact, fax machines were around long before the telephone came along.  Early faxes simply used different methods of transmission.

From Wikipedia’s article on Scottish inventor Alexander Bain:

Bain worked on an experimental facsimile machine in 1843 to 1846. He used a clock to synchronise the movement of two pendulums for line-by-line scanning of a message. For transmission, Bain applied metal pins arranged on a cylinder made of insulating material. An electric probe that transmitted on-off pulses then scanned the pins. The message was reproduced at the receiving station on electrochemically sensitive paper impregnated with a chemical solution similar to that developed for his chemical telegraph. In his patent description dated 27 May 1843 for "improvements in producing and regulating electric currents and improvements in timepieces, and in electric printing, and signal telegraphs," he claimed that "a copy of any other surface composed of conducting and non-conducting materials can be taken by these means". The transmitter and receiver were connected by five wires.

Alas, Bain’s invention, although later improved by Frederick Bakewell, produced poor images and was not commercially viable.  Then along came Italian physicist Giovanni Caselli with his Pantelegraph.  From Wikipedia:

Caselli developed an electrochemical technology with a "synchronizing apparatus" (regulating clock) to make the sending and receiving mechanisms work together that was far superior to any technology Bain or Bakewell had.  The technology is relatively simple. An image is made using non-conductive ink on a piece of tin foil. A stylus, that is in the electrical circuit of the tin foil, is then passed over the foil where it lightly touches it. The stylus passes with parallel scans slightly apart. Electricity conducts where there is no ink and does not where there is ink. This causes on and off circuits matching the image as it scans. The signals are then sent along a long distance telegraph line. The receiver at the other end has an electrical stylus and scans blue dye ink on white paper reproducing the image line-by-line, a fac simile (Latin, "make similar") of the original image.
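
To make that on/off scanning idea concrete, here is a toy sketch in Python.  It is purely my own illustration, not anything taken from Bain’s or Caselli’s actual designs: the “original” is a grid of inked and bare cells, the transmitter scans it line by line into on/off pulses, and the receiver marks the paper wherever the circuit was broken.

# Toy model of pantelegraph-style scanning (illustrative only; the names
# and the grid representation are my own, not from the descriptions above).

# The "original": '#' is insulating ink, '.' is bare, conductive tin foil.
original = [
    "..##..",
    ".#..#.",
    "..##..",
]

def transmit(image):
    # Scan line by line: bare foil closes the circuit (1), ink breaks it (0).
    return [[0 if cell == "#" else 1 for cell in row] for row in image]

def receive(pulses):
    # The receiver marks the paper wherever the circuit was broken,
    # rebuilding the image line by line.
    return ["".join("#" if p == 0 else "." for p in row) for row in pulses]

copy = receive(transmit(original))
print("\n".join(copy))  # prints a facsimile of the original pattern

The real machines, of course, also had to keep the transmitter and receiver scanning in lockstep, which is what the pendulums and “regulating clock” mentioned above were for; the sketch conveniently ignores that part.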


Caselli’s invention was implemented in “the first commercial telefax service between Paris and Lyon in 1865, some 11 years before the invention of telephones.”

So not only does the fax predate the telephone, but fax transmission media evolved from direct wire (Bain in 1846), to telegraph lines (Caselli, 1865) and to wireless via radio (RCA in 1924) long before Xerox introduced fax over telephone lines in 1964.  Interestingly, the radio facsimile developed by RCA in the 20’s “is still in common use today for transmitting weather charts and information to ships at sea”.

I had no idea!


Tuesday, August 21, 2012

Why?

Dictionary.com provides seven definitions for the word “why”: one as an adverb, three as a conjunction, two as a noun and one as an interjection.  All are interrelated and have to do with the reasons for things, except one.  The interjection.  This is the one that consternates me (and, hopefully, consternates you as well).  Why would a word whose meaning is related to the reasons for things be  “used as an expression of surprise, hesitation, etc., or sometimes a mere expletive”?


For example, why is the word “why” used in the following manner:


“Why, thank you!”

Or (from Charlotte’s Web by E.B. White, also of “The Elements of Style” fame):

“Why, how perfectly simple!” she said to herself.

Granted, the use of “why” as an interjection is scarce in contemporary conversation; it is sometimes employed to make dialogue sound dated in TV shows, such as this SNL sketch, which uses “Three Stooges”-style phrases like “why, I oughtta pound you” to help set the scene in the 1930’s.  James Stewart uses “why” as an interjection repeatedly in 1946’s “It’s a Wonderful Life”.  But why did the word “why”, which dates back prior to the year 900 and derives from the Old English hwī (the instrumental case of hwæt), wind up being used as an expression of mild surprise, beginning around 1510?

Although I was unable to find a definitive, authoritative answer, I did find plausible speculation from an apparently reputable source, user Cerberus, who is ranked 11th in all-time reputation on the English Language and Usage site from the Stack Exchange network of Q&A websites.  The esteemed Cerberus holds that “why”, used in a rhetorical sense, evolved into “a mere interjection of surprise”.  I will use my own sentences to explain the evolution below, based upon Cerberus’s logic:

First, used as part of a rhetorical question: “We’d better take an umbrella tonight.  Why is that?  Because it’s pouring!”

Then, used on its own as the rhetorical question reduced to its essence: “We’d better take an umbrella tonight.  Why?  Because it’s pouring!”

It makes sense that it then developed into a method of calling attention to the statement which follows it: “We’d better take an umbrella tonight.  Why, it’s pouring!”

Still a rhetorical question, but it seems likely that it would evolve from there to the interjection usage.  The only thing we know for sure is that, whichever way this happened, it took around 600 years!




Thursday, August 16, 2012

This.... sucks!

I admit it, I’ve been called a “nerd”.  I’ve taken it as a serious compliment, attaching to it its more contemporary meaning: “an intelligent but single-minded person obsessed with a nonsocial hobby or pursuit: a computer nerd”, and not its original meaning “a stupid, irritating, ineffectual, or unattractive person”.  (I know what you’re thinking: I’m attaching the wrong meaning, but let’s just leave it there, shall we?)  I’ve also been called a “geek”, and once again, am flattered because I take it to mean: “a computer expert or enthusiast”, and not: “a carnival performer who performs sensationally morbid or disgusting acts, [such] as biting off the head of a live chicken”, the word’s original meaning.  What were insults just a few years ago are now compliments.  Words evolve.

As Seth Stevenson puts forth in this great article on slate.com (incorporating the references above), it’s time to divorce the wonderfully succinct descriptor “sucks” from its alleged vulgar roots.  I say alleged because, as Stevenson indicates, it is entirely possible that the word’s contemporary connotation did not even originate from fellatio:

...it's not even clear that sucks has naughty origins. We might trace its roots to the phrase sucks hind teat, meaning inferior. Or there's sucks to you, a nonsexual taunt apparently favored by British schoolchildren of yore.

Or, from an anonymous contributor to wiki.answers.com who apparently was there at the exact place and time the phrase came to be used as we know it today, we have this:

only [sic] one meaning fits the time that the phrase became a popular expression, not old England or [sic] something in the backseat [sic] of a ‘55 Chevy.  as [sic] a Vietnam vet i [sic] was there when and where it started.  we [sic] all used the expression “it sucks” to describe something really bad as in a ‘sucking wound’ is [sic] a open [sic] chest injury where the lung was exposed to air and sucking noise was what you might hear just before someone dies.  thus [sic] a sucking sound is a number 10 as in a bad, bad thing.

But regardless of whether the origin of “sucks” was vulgar, its usage today, as Stevenson so effectively argues, is simple, concise and emphatic.  And it joins “nerd” and “geek” as expressions that have taken on new meanings to such an extent that most people using them are unaware of their original meanings.  I cannot agree more with Stevenson when he concludes:


Personally, I wish sucks could escape from its slangy ghetto. It's a terrifically punchy little syllable, with that "k" lending it the proven Starbucks/Nike/Kinko's power of the "sticky consonant." And take heart, sucks-haters. Soon enough, another bit of slang will come along and gain entrance into our common language, and it will be vastly more offensive than sucks ever was.

Monday, August 13, 2012

The Cloud: Back to the Future

It’s hard to believe, but when I joined the workforce in the early ‘80s, the fax was just beginning to be used in business, rapidly replacing the telex, which was, amazingly, still widely used at the time.  And, although the Apple II and IBM PC had already been introduced, and I actually had the pleasure of working with VisiCalc on an Apple II for a glorious summer job a few years before, personal computers were few and far between at International Savings and Loan Association (ISL), the South Florida financial institution where I got my first real job.  “Dumb terminals”, though, were everywhere.

Relatively inexpensive hardware devices that were connected directly to mainframe computers, dumb terminals would, through archaic commands, make queries of the mainframe and return the results.  The terminals had neither processing power nor data storage capabilities of their own; they simply allowed users to access the data on the mainframe computer.  The mainframes were maintained by professionals at Data Processing departments (predecessors to today’s IT departments).  ISL wasn’t large enough to afford a mainframe computer of its own, so our dumb terminals were connected (via things like RS-232 and leased lines) to Florida Informanagement Services, an Orlando company that provided data processing services for smaller savings and loan institutions.

Suddenly, personal computers began appearing on people’s desks!  These early machines, made by Apple, IBM and the so-called “clone makers” (Eagle, Compaq, Franklin, et al.), differed from dumb terminals because they had their own processing power and data storage capability, and, initially, were not connected to anything at all.  With these PCs, users had the entire computer at their disposal to do with as they pleased, but if they wanted to share their work with anyone it had to be done via floppy disks or printed copies.  But, for the first time, small businesses could actually afford real computers of their own.

The next step, of course, was networking.  By 1988 I had left ISL and started a new small business.  There were four of us at the office, each with a PC-compatible computer on his desk.  One magical day a couple of guys came out to our office, physically connected the computers to each other and installed Novell NetWare software on each of them.  Our computers were now able to communicate with each other, and we could install multi-user software that would allow all of us access to the data.  Small office servers could be put into place to make files and other services available to all users.  Amazing stuff.

Although online services such as America Online (later known as AOL) and CompuServe had been available since the late ‘80s, and I had actually been an early CompuServe user, in the early ‘90s the Internet began its inexorable march toward commercial ubiquity in earnest, beginning with the emergence of email as a business communication tool, and later with the availability of the World Wide Web.  Computers suddenly needed not only to be connected to each other at the office, but also to the Internet.  This connection was initially achieved via dial-up modem at per-minute rates, later at flat monthly rates and, finally, via today’s broadband technologies: DSL, cable, and fiber.

Once computers and networks were permanently connected to the Internet, what we today call “cloud computing” became possible.  In fact, I was an early believer in this concept, and in 2000 moved my business’ accounting data from our office server to an online service known then as NetLedger (later known as NetSuite), where it remained until I dissolved the company just a few months ago.  To me, NetLedger was the harbinger of the future architecture of computing, and for the next few years I focused on using only technologies where the data was cloud-based and accessible from any computer, regardless of operating system.  So I changed my email protocol of choice from POP to IMAP, and later just accessed my email through the web itself.  I moved all of my local files, including photos, to Dropbox, made extensive use of Google Docs (now Google Drive), moved my music to Amazon’s Cloud Drive, and used other services such as Evernote, Mint, and Gliffy.  I stopped making presentations in PowerPoint and Keynote, and instead used the online service Sliderocket.  By around 2006, basically the only software I used on any of my computers was the browser!  I made a conscious effort to live in the cloud, but my children and all of their friends naturally gravitated to the same concept without even knowing it, since from the beginning of their computing experience all of their “stuff” was online on services such as MySpace, Facebook, Twitter, Instagram, etc.

The way I see it, the architecture of computing has come full circle during my 28-year career.  I started out with a dumb terminal connected to a far-off server.  Then I got my own computer, with its own processing power and storage, but isolated from other computers.  Then my computer got networked and was able to communicate with nearby computers.  Finally, my computer became part of the network of networks that is the Internet, and this allowed it to access all sorts of servers all over the world; servers which, in my particular case and that of a growing number of users, store and process all of my data.  So my computers, albeit exponentially more powerful than all those that came before them, are basically acting as.... well, dumb terminals!

Obviously there is a huge difference between a monochrome terminal connected to a limited, single-purpose server running software that only specially trained users can operate, and computers of all shapes and sizes (desktops, laptops, smartphones and tablets), with amazing high-resolution, full-color displays and access to literally billions of resources that users can take full advantage of with little or no training.  But in their overarching principle they are the same: devices used to connect to professionally maintained servers, where the data and processing power reside.  The main difference is that, although the servers are still maintained by professionals, unlike in the terminal days of yore we users now have the power to access whichever servers we want, via web browsers or apps, and the choice of whether to keep our data on our own devices, in the cloud, or both.  As an example, I could have written this essay on my computer using any of a myriad of word processing apps, and set up my own web server to publish it, thus using the storage and processing power of my own computer.  Instead, I used various computers to write, edit and proofread the essay on Google Docs, and then published it on Squarespace.  The file never touched any of the computers I used to create it!

We’ve come back to the future, but now, we have choice.  The best of all worlds.  Or, at least that’s what we think today.  I wonder what we will think in another 28 years!


Thursday, August 9, 2012

eBooks and Inertia

I read my first electronic book in 1998, Stephen King’s outstanding novella “Riding the Bullet”, on my brand new Rocket eBook.  Mind you, although the Rocket was marketed as being “about the size of a paperback book”, it weighed 22 ounces, considerably more than, for example, this paperback, which is probably about average and weighs in at 13.1 ounces.  The device got pretty hot after a while, and the screen was, by today’s standards, horribly low-resolution and garishly backlit.  To get a book onto the Rocket you first had to download it onto your computer (after installing special software on it) over the not-quite-yet-ubiquitous Internet, then connect the Rocket to the computer’s serial port (remember those?).  Comparing it to today’s $79 Amazon Kindle, which weighs less than 6 ounces, is like comparing a 1919 Model T to a 2013 Audi A8.

But how I loved that Rocket eBook!  After “Riding the Bullet”, I read many more books on it, including Dan Brown’s “Digital Fortress” and “Angels and Demons”, always enduring the curious looks, raised eyebrows and inevitable questions I would get from strangers who had no idea what that strange contraption in my hands was.  I, on the other hand, was sure that the end of paper books was clearly at hand, and that hardcovers and paperbacks would disappear in a matter of months.  Indeed, I kicked myself because I had actually thought of an electronic book reader years before and could not believe that the NuvoMedia people (makers of the Rocket eBook) had beaten me to it!  My vision, though, came before the advent of the Internet, so I imagined that Barnes & Noble and their ilk would sell tiny memory chips, containing books, that one would purchase and insert into “readers” similar to the Rocket.

I was far off on my prediction, though.  Not only did the Rocket eBook fail, but here we are, 14 years later, and net sales revenue from eBooks surpassed that from hardcover books for the first time only in the first quarter of 2012, according to Galleycat.  The same article indicates that revenue from paperback books was, wait for it, still higher than that from eBooks during the same period.

How can it be that we are able to instantly download inexpensive books to all sorts of amazing, reasonably priced devices, have literally hundreds of books at our fingertips in a reader weighing less than 6 ounces, have our choice between gorgeous full-color screens of various sizes and E Ink devices offering reading experiences almost exactly like reading a paper book, and yet, still, we read more books on paper than on electronic media, and our children still lug around those instantly obsolete bricks we call textbooks?  Obviously it’s easy to think of a few reasons why this is the case: eBooks may disrupt the outdated business models of some major players in the industry; some people prefer the feel of paper books; and, sadly, not everyone can afford even a $79 eBook reader.  In other words (or, actually, word): Inertia.

But given the lightning-fast proliferation of other technologies such as mobile phones and the Internet itself, I’m still puzzled by the glacial pace of eBook adoption.  I’ll leave you with this comparison to consider:  I believe that the advantages afforded by eBooks over paper books are at least as important as (if not much more important than) the advantages that the automobile offered over the horse and buggy.  Moreover, the transition from paper books to eBooks is much less jarring than that from horse-drawn vehicles to self-powered cars.  Yet although large-scale, production-line manufacturing of affordable cars did not begin in the U.S. until 1902, by 1910 (only 8 years later!) the number of automobiles had surpassed the number of buggies.  My Rocket eBook was released in 1998, and it was by no means the first eBook reader.  The Amazon Kindle has been around for 5 years.  Yet hardcover books and paperbacks combined outsold eBooks by 87.5% in the first quarter of 2012!  Almost twice as many paper books as eBooks!

Inertia indeed.


Saturday, August 4, 2012

How do you do?

I’ve always wondered about the formal greeting, “How do you do?”.  What, exactly, does this mean?  How do you do what?  What sense does this make?  It seemed like something was missing at the end of the phrase, or that the phrase was just wrong.

As it turns out, the phrase makes perfect sense when taken in the proper historical context.  It all goes back to the early meaning of the verb “do”, which has been used since the 14th century to mean “prosper, or thrive”.  So, in the 15th and 16th centuries, people would ask each other “How do you?” in the same sense that we ask each other “How are you?” today.  However, at the time, the phrase “How do you” was used literally, as a query about health, and not as a greeting.  The change in usage from a specific query to a general greeting was gradual, and the greeting form did not become widespread until the 18th century.

OK, but why the second, final “do”?

Here there is no absolutely definitive answer, but there is a probable one.  By the 18th century, when “How do you?” had become a commonplace greeting, there had been a change in the accepted form of expression.  For example, the medieval “whither goest thou” became “where are you going”.  So “How do you” sounded antiquated, and was “updated” to “How do you do”.

So the origin of the formal greeting “How do you do?” is clear, yet it raises the question: why did this query about another’s health evolve into a greeting?  Using a wide-ranging question as a greeting seems awkward, since the one querying is not really expecting an actual answer, but instead a meaningless positive response and a reciprocal query/greeting.  Imagine if, at a formal event, you’re asked “How do you do?”, and you actually respond truthfully about the state of various aspects of your health!  Even when we use an actual greeting, such as “hi”, “hello”, or even “howdy” (which, interestingly, is probably a contraction of the early “how do ye”), it’s invariably followed by an insincere query or two.  The greeting process thus becomes a greeting, followed by a few meaningless queries and rote answers, before the actual conversation begins.

Interestingly, many science fiction aliens and future humans simply and elegantly greet by actually saying “Greetings”, implying that science fiction writers feel we will one day move on from our almost comical greeting rituals.  Only time will tell.

[I found most of the material regarding the origin of the phrase “How do you do?” in this article on  The Phrase Finder website.] 


Tuesday, July 31, 2012

Nexus 7

For many people, the iPad has replaced their laptop as their go-to mobile computer.  Although I have used my first-generation iPad extensively since I bought it shortly after it became available a couple of years ago, I am clearly not one of those people.

I’ve enjoyed countless books and movies on my iPad, checked my banking and investment activity, answered email, browsed the web, and generally handled light computing duties that were previously the purview of one of my computers, or my iPhone.  However, although the iPad is more than capable of handling other tasks, I’ve preferred to handle them on my computers, simply because their keyboard/pointer paradigm and multi-window (or multi-tab) environment allow me to complete those tasks more quickly and elegantly than I ever could on the tablet.  So my iPad never replaced my laptop; it simply gave me a better alternative for certain tasks and activities, a painful alternative (thus hardly ever used) for others, and a non-alternative for still others.  My iPad became my MacBook Air’s constant companion in my computer bag, joining it and the iPhone (a constant in my left front pocket) as my tech triumvirate: overlapping in functionality, for sure, yet none dispensable due to the presence of the others.

Enter the Nexus 7 (N7).

I had played around with a Kindle Fire, and even then was seduced by its size and weight.  However, its horrid interface and lackluster performance made it a non-starter.  The N7 is exactly what I hoped the Kindle Fire could be but clearly wasn’t.

The N7 is small and light, and the difference between its size and weight and the iPad’s is huge.  The N7 is a device you want to handle and carry around with you, and once you do, the iPad seems clunky, heavy and delicate.  Jelly Bean (Android version 4.1) runs great on the N7, and is the Android version that, finally, truly approximates iOS’s fluidity and responsiveness while maintaining the unparalleled seamlessness with all things Google that Android has offered from the start.  The N7’s screen, while sharp and bright, does not pretend to compete with the new iPad’s Retina Display; however, at least for me, it’s more than adequate.  Almost all of the apps I used frequently on the iPad are available on Android, and the functionality of the few that aren’t is readily available on the web.  The only drawback I’ve encountered during my first week with the N7 is the scarcity of movies and TV shows available on the Google Play store compared to the iTunes Store.  However, I’m smitten enough with the N7 that I’ve actually switched to watching the TV shows that are available on it while working out and traveling, instead of grabbing my iPad.  (Note for international travelers: make sure to load up on media before leaving the country, since Google Play, unlike the iTunes Store, will allow neither purchases nor downloads from outside the US.)

Apple is strongly rumored to have a smaller iPad in the works, and Amazon is thought to be on the verge of coming out with a whole new line of (presumably) more capable Kindle Fire devices.  In the meantime, however, the Nexus 7 is my tablet of choice.  It does everything that I used my iPad for just as well, but in a much more convenient, sexy package.

Many people have replaced their laptops with iPads, and they and others perform all sorts of creative and productive tasks on their tablets.  The Nexus 7 will probably not appeal to them.  But to those of us who use our tablets mostly for media and data consumption and as supplements, rather than replacements, for our laptops and desktops, the Nexus 7 hits a spot much sweeter than the current iPad.