The 5 in 5: Innovations That Will Change Our Lives in the Next Five Years

“Computers are powerful now, but what will they be able to do five years from now? This article from IBM has the answer—hear predictions from some of the best and brightest on how technology will integrate further into our lives in the following videos.”

via IBM.com

We think of the five senses as exclusive to living things.

Processing sights and sounds requires eyes, ears and, most important, a brain—right? But what if your hardware shared your senses?

In the era of cognitive computing, systems learn instead of passively relying on programming. As a result, emerging technologies will continue to push the boundaries of human limitations to enhance and augment our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more. No need to call for Superman when we have real super senses at hand.

This year IBM presents The 5 in 5 in five sensory categories, through innovations that will touch our lives and see us into the future.


Touch: You will be able to touch through your phone

In the 1970s, when a telephone company encouraged us to “reach out and touch someone,” it had no idea that a few decades later that could be more than a metaphor. Infrared and haptic technologies will enable a smartphone’s touchscreen and vibration capabilities to simulate the physical sensation of touching something. So you could experience the silkiness of that catalog’s Egyptian cotton sheets instead of just relying on some copywriter to convince you.


Sight: A pixel will be worth a thousand words

Recognition systems can pinpoint a face in a crowd. In the future, computer vision might save a life by analyzing patterns to make sense of visuals in the context of big data. In industries as varied as healthcare, retail and agriculture, a system could gather information and detect anomalies specific to the task—such as spotting a tiny area of diseased tissue in an MRI and cross-referencing it with the patient’s medical history for faster, more accurate diagnosis and treatment.
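As a purely illustrative sketch of the anomaly-detection idea (the article describes no implementation; everything below, from the function name to the pixel values, is invented), one simple approach flags pixels that deviate sharply from a statistical model of healthy tissue:

```python
# Hypothetical sketch: flag pixels whose intensity deviates from a baseline
# "healthy tissue" model by more than z_thresh standard deviations.
def flag_anomalies(scan, baseline_mean, baseline_std, z_thresh=3.0):
    flagged = []
    for y, row in enumerate(scan):
        for x, value in enumerate(row):
            z = abs(value - baseline_mean) / baseline_std
            if z > z_thresh:
                flagged.append((y, x))
    return flagged

# Toy 4x4 "scan": healthy tissue reads near 100, one bright anomaly.
scan = [
    [100, 101,  99, 100],
    [ 98, 100, 150, 101],
    [100,  99, 100, 100],
    [101, 100,  98, 100],
]
print(flag_anomalies(scan, baseline_mean=100.0, baseline_std=2.0))  # -> [(1, 2)]
```

A real diagnostic system would learn its model of “healthy” from large image datasets rather than from two fixed numbers; the point here is only the shape of the computation: compare, threshold, flag.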


Hearing: Computers will hear what matters

Before the tree fell in the forest, did anyone hear it? Sensors that pick up sound patterns and frequency changes will be able to predict weakness in a bridge before it buckles, interpret the deeper meaning of your baby’s cry or, yes, detect a tree breaking down internally before it falls. By analyzing vocal traits and incorporating multisensory information, machine hearing and speech recognition could even be sensitive enough to advance dialogue across languages and cultures.
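To make the bridge example concrete: a structure’s resonant frequency tends to drop as it weakens, so a sensor can compare the dominant frequency of recent vibration against a healthy baseline. The sketch below is a made-up illustration of that comparison (a crude zero-crossing frequency estimate, not any real monitoring system):

```python
import math

def dominant_freq(samples, sample_rate):
    """Roughly estimate a signal's dominant frequency by counting zero crossings."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    duration = len(samples) / sample_rate
    return crossings / (2 * duration)  # a sine wave crosses zero twice per cycle

def weakening_alert(samples, sample_rate, healthy_hz, drop_fraction=0.1):
    """Flag the structure if its resonant frequency has dropped noticeably."""
    observed = dominant_freq(samples, sample_rate)
    return observed < healthy_hz * (1 - drop_fraction)

# Toy example: a "bridge" that used to ring at 10 Hz now rings at 8 Hz.
rate = 1000
tone = [math.sin(2 * math.pi * 8.0 * t / rate) for t in range(rate)]
print(weakening_alert(tone, rate, healthy_hz=10.0))  # -> True
```

Real structural-health monitoring relies on far richer spectral analysis, but the alerting logic is the same: watch for a sustained downward shift from the known-good signature.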


Taste: Digital taste buds will help you eat smarter

The challenge of providing food—whether it’s for impoverished populations, people on restricted diets or picky kids—lies in finding a way to meet both nutritional needs and personal preferences. In the works: a way to compute “perfect” meals using an algorithmic recipe of favorite flavors and optimal nutrition. No more need for substitute foods when you can have a personalized menu that satisfies both the calorie count and the palate.
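The “algorithmic recipe” idea can be sketched as a scoring function that blends flavor preference with closeness to a calorie target. The article specifies no actual algorithm, so all names, weights and meals below are invented for illustration:

```python
def score_meal(meal, preferences, calorie_target, pref_weight=0.5):
    """Score a candidate meal: average flavor-preference match (0-1 per flavor)
    blended with how close its calories come to the diner's target."""
    flavor_fit = sum(preferences.get(f, 0.0) for f in meal["flavors"]) / len(meal["flavors"])
    calorie_fit = max(0.0, 1.0 - abs(meal["calories"] - calorie_target) / calorie_target)
    return pref_weight * flavor_fit + (1 - pref_weight) * calorie_fit

# Made-up diner profile and candidate menu.
preferences = {"sweet": 0.9, "umami": 0.7, "bitter": 0.1}
meals = [
    {"name": "kale salad",   "flavors": ["bitter"],         "calories": 350},
    {"name": "miso salmon",  "flavors": ["umami", "sweet"], "calories": 520},
    {"name": "triple fudge", "flavors": ["sweet"],          "calories": 1100},
]
best = max(meals, key=lambda m: score_meal(m, preferences, calorie_target=500))
print(best["name"])  # -> miso salmon
```

The dessert scores well on flavor but is penalized for overshooting the calorie target, which is exactly the “calorie count and the palate” trade-off the paragraph describes.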


Smell: Computers will have a sense of smell

When you call a friend to say how you’re doing, your phone will know the full story. Soon, sensors will detect and distinguish odors: a chemical, a biomarker, even molecules in the breath that affect personal health. The same smell technology, combined with deep learning systems, could troubleshoot operating-room hygiene, crops’ soil conditions or a city’s sanitation system before the human nose knows there’s a problem.
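As a toy illustration (not any real IBM system), an electronic nose can be modeled as an array of chemical sensors whose combined readings form a fingerprint, matched against known odors by nearest distance; every fingerprint value below is fabricated:

```python
import math

# Hypothetical e-nose sketch: each known odor has a characteristic
# "fingerprint" across three chemical sensors (all values invented).
FINGERPRINTS = {
    "clean":   [0.1, 0.1, 0.1],
    "ammonia": [0.9, 0.2, 0.1],
    "acetone": [0.2, 0.8, 0.7],  # e.g. elevated breath acetone as a biomarker
}

def classify_odor(reading):
    """Label a sensor reading with the closest known fingerprint (Euclidean)."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(FINGERPRINTS, key=lambda name: distance(FINGERPRINTS[name], reading))

print(classify_odor([0.85, 0.25, 0.15]))  # -> ammonia
```

A deployed system would pair many more sensor channels with a learned classifier rather than hand-written fingerprints, but the pattern is the same: sense, compare, name the smell.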
