Opinion & Analysis

Informational deluge has a silver lining

I will, for the purposes of this piece and a couple or more others to follow, call it the Information Age.

It is an age characterised, fundamentally, by easy and virtually instantaneous access to information and to any conceivable body of knowledge, or snippets thereof. It is an age in which all manner and aspects of information are obtainable just by the press of a button, the click of a mouse, or the brush of a finger across a touch-sensitive display screen.

The Information Age dawned sometime in the 1970s, though it was not until the 1980s through the 1990s that it became mainstream in the ranks of the broader global citizenry. It is marked by the increased production, transmission, consumption of, and reliance on information.

To be part and parcel of the Information Age makes for very exciting times. This is the age of the computer in its various guises, such as desktops, notebooks, and iPads; of wireless communication; the Internet; digital satellite TV; Skype; Google; Facebook; WhatsApp; and live-streaming. These constitute only a few of the information access breakthroughs of our day.

The so-called Millennials (the generation born between 1981 and 1996) and Post-Millennials (the generation that arose in 1997 and thereafter, also known as the Digital-ites or Generation Z) may take such electronic devices as the mobile phone, the DVD, and the memory stick for granted, but scarcely do they appreciate that they were, in a manner of speaking, born into an ultra-modern age.

I myself belong to what has been dubbed Generation X – the bracket that came along between 1965 and 1980 – and trust me, ours was the era of that Ancient of Days, the landline telephone; of the practically obsolete VHS cassette tapes and floppy disks; and of the totally outmoded gramophone players and vinyl LP records.

Some ranks of the senior citizenry – those who came into the world just ahead of the Baby Boomers (people born between 1946 and 1964) – still need a bit of help on the laptop keyboard and in surfing the World Wide Web, since these leaps in information storage and retrieval came into vogue when they were well into the noon of their lives.

Has the Information Age run its course? There’s an unassailably persuasive case for the argument that even as I write, the Information Age is incrementally giving way to what is generally termed the Experience Age. 

Such a shift is inevitable considering this chain of causation: the Agricultural Revolution paved the way for the close-at-heel Industrial Revolution in the late 18th century; the Industrial Revolution in turn ushered in the Industrial Age; the Industrial Age provided the context for the emergence of the Digital Revolution in the mid-20th century; and the Digital Revolution made it possible for the Information Age to flourish relatively recently.

The Information Age is based on information gathering facilitated by computerisation. A pundit of, and a bit-player in, the trends of the Information Age posits thus: “In the Information Age, the start of communication was information. On Facebook, you type into a status box, add metadata such as your location, and select from a hierarchy of emotions for how you feel. This information-first approach is also visible in Facebook’s feedback mechanisms — six pre-selected reactions with threaded commenting.”

In the Information Age, human work, play, and interaction patterns are in large measure shaped by systems for processing and distributing information. A defining attribute of a tool of the Information Age such as the wildly popular Facebook, for instance, is accumulation: one’s identity becomes the sum total of all that he or she has saved on a data storage device – texts, photos, videos, audio clips, web pages, and so on.

With the changing context of our online interactions, the status box is becoming less fashionable as we progress from collating information for its own sake to actively living and sharing the experience itself. Welcome to the Experience Age!

Snapchat is a trendsetter in the Experience Age. If the Information Age is epitomised by Facebook as a social media platform, the Experience Age is being defined by Snapchat, a multimedia messaging app that is already making its mark and as of the last count boasts a market capitalisation of about $23 billion, though Facebook Live and Periscope are not far behind in staking a similar claim.
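
To make the contrast concrete, here is a minimal sketch in Python. The record types and names are invented for illustration and are not drawn from Facebook's or Snapchat's actual systems: one model keeps every saved item forever (accumulation), the other lets a shared moment simply expire.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Dict, List, Optional


@dataclass
class Post:
    """An information-first item: text plus metadata, kept indefinitely."""
    text: str
    location: Optional[str] = None
    reactions: Dict[str, int] = field(default_factory=dict)  # e.g. {"like": 3}
    created_at: datetime = field(default_factory=datetime.now)


@dataclass
class AccumulatingProfile:
    """Information Age identity: the ever-growing archive of everything shared."""
    posts: List[Post] = field(default_factory=list)

    def share(self, post: Post) -> None:
        self.posts.append(post)  # nothing is ever thrown away


@dataclass
class EphemeralSnap:
    """Experience Age item: a moment that expires rather than accumulates."""
    media: bytes
    created_at: datetime = field(default_factory=datetime.now)
    lifetime: timedelta = timedelta(hours=24)

    def is_visible(self, now: datetime) -> bool:
        return now - self.created_at < self.lifetime
```

The design difference is the whole point: the first model grows without bound and so defines you by your archive, while the second holds nothing back for posterity at all.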

In fact, there’s, as we speak, an Experience Age domination race of sorts, characterised by a layering of technology we can aptly call the Experience Stack (see accompanying illustration). As you ascend the stack, the layers take on a much more pronounced virtual reality (VR) character. With further innovations such as Airtime and A New Morning, we’re headed for the stratosphere of experiential interfacing and interaction.

The Experience Age is certainly an age of particular technologies – microcomputers, mobile sensors, and high-speed connectivity – not very much unlike the Information Age, its overlapping predecessor. It engenders a convergence of our offline and online identities. It’s more about continual self-expression than accumulated information – the reason why Facebook’s recent investments bring live video, 360-degree cameras, and VR products all into a single portfolio, and why the likes of Google and Alibaba have staked up to $2 billion in Magic Leap, an Augmented Reality (AR) start-up.

Somebody sums up the staggering potential and possibilities of the Experience Age best in these words:

“The Experience Stack will drive new products to market faster as each layer can grow independently, while at the same time benefiting from advancements in the layers below.

 “An example of this phenomenon is high-speed 3G enabling Apple’s App Store, which together advanced mobile as a whole. The best products of the Experience Age will be timely new applications that leverage step-change advancements in bottom layers. 

“Given that some layers are still nascent, tremendous opportunity is ahead.”

If a certain fear is in order with regard to the Information Age, it is information overload. Too much of anything may militate against its own efficacy.

If you can easily find every scrap of information on any subject, you are certain to find support for any view you are trying to advance or sell, or any hypothesis you wish to test. Also, if every bit of information is within reach, it will tend to make people intellectually lazy, to the point where thinking and judgement take a back seat. It explains why reputable institutions of learning run submitted theses and dissertations through plagiarism-detection software such as Turnitin, to make doubly sure the work was not lifted word for word from the quality efforts of others.
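
By way of illustration only – the matching inside a tool like Turnitin is far more sophisticated, and the function names below are invented – a word-for-word overlap check can be as simple as counting the word sequences two texts share:

```python
import re


def ngrams(text: str, n: int = 5) -> set:
    """Split a text into overlapping word n-grams, ignoring case and punctuation."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def overlap_score(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's n-grams that also appear in the source (0.0 to 1.0)."""
    sub, src = ngrams(submission, n), ngrams(source, n)
    return len(sub & src) / len(sub) if sub else 0.0


# A score close to 1.0 suggests a passage was lifted nearly word for word.
print(overlap_score(
    "the information age dawned sometime in the 1970s and became mainstream later",
    "The Information Age dawned sometime in the 1970s and became mainstream later.",
))  # 1.0
```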

The Information Age came with multiple sources of stimulation – the smartphone, the tablet, and the Kindle, for instance. As we juggle these gadgets, as if performing tricks of sleight of hand, we compromise our capacity to focus and concentrate on a single activity. The tragedy of the Information Age is a deficit of attention, making attention the scarcest resource of our day.

Often also, we are lightning-quick to access information, but we struggle to make sense of it or apply it in the best-fit situation.

One savant identifies two contrasting risks in respect of too much information.

“One is that we become obsessed with getting to the bottom of a problem, and we keep on digging, desperate to find the truth but taking forever to do so,” he writes. “The other risk is that we become overwhelmed with the amount of information out there and we give up: we realise we cannot actually master the issue at hand, and we end up falling back on a pre-existing belief.”

Information is not the be-all and end-all. The input of the human mind in the final analysis is crucial. Says the same savant quoted above: “The most successful companies in the future will be smart about scanning for information and accessing the knowledge of their employees, but they will favour action over analysis, and they will harness the intuition and gut-feeling of their employees in combination with rational analysis.”

A case in point is Jeff Bezos’ Amazon.com, whose phenomenal success is attributed in the main to its capacity to make the big calls based on judgement and intuition.

In 1949, the British author and essayist George Orwell – he of Animal Farm fame – wrote a futuristic work of fiction he titled 1984. In it, Orwell shone a light on a despotic global power that used technology to surveil and spy on the activities of mankind in a region of the world known as Oceania, with a view to controlling and keeping tabs on society round the clock.

Orwell, who was suspiciously prescient, set his scenario 35 years ahead of the time of writing, but it was not until the turn of the century that the Big Brother dispensation broke upon us. In 2011, for instance, the Chinese government came up with WeChatSpy, a proficient monitoring application that not only uses phone-tracking software but also flags every electronic transaction one makes by way of a debit or credit card. In other words, the app is capable of pinpointing your every movement.

WeChatSpy is underpinned by over 200 million CCTV cameras that watch over every inch of China’s territory, so that if, for example, you test positive for COVID-19, your trail can easily be picked up. Everybody you rubbed shoulders with lately will be identified, contacted, and ordered to undergo the test too, whereupon they will either be quarantined or forced into self-isolation.
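
Purely as a toy illustration – no real contact-tracing system is this simple, and every name and record below is made up – the "who rubbed shoulders with whom" step amounts to joining sighting records on place and time:

```python
from datetime import datetime, timedelta

# Hypothetical sighting log: (person, place, time), e.g. from card swipes or cameras.
sightings = [
    ("Amo",    "Gaborone Mall", datetime(2020, 4, 1, 10, 0)),
    ("Kabelo", "Gaborone Mall", datetime(2020, 4, 1, 10, 20)),
    ("Neo",    "Bus Rank",      datetime(2020, 4, 1, 16, 0)),
]


def contacts_of(person: str, window: timedelta = timedelta(hours=1)) -> set:
    """People recorded at the same place within `window` of the given person."""
    mine = [(place, t) for p, place, t in sightings if p == person]
    return {
        p for p, place, t in sightings
        if p != person and any(place == mp and abs(t - mt) <= window for mp, mt in mine)
    }


print(contacts_of("Amo"))  # {'Kabelo'}: seen at the same mall within the hour
```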

Other monitoring applications that have since sprung up include Spy Phone App, Snapchat Spy, Facebook Spy, Twitter Spy, Instagram Spy, Tinder Spy, and the like. These are ostensibly not meant to nose around in people’s private affairs: they can be used civilly, such as for parental control or for monitoring employees to reinforce productivity or deter pilferage.

The days of utter secrecy or iron-clad anonymity are over, folks. Big Brother is watching our every step. The only good thing about it all is that we can put the same intrusive technology to positive and productive use.

Everything in life has its nether aspects, a case in point being atomic energy, which can be used to obliterate a nation state (negative) or generate electric power (positive).

Ultimately, it is the use to which you put a piece or form of technology that really matters, and not the technology itself.

LESANG MAGANG