
Here’s why first impressions matter

The reason first impressions are so powerfully important starts with bias. Bias is prejudice for or against a person, an idea, the model of a car—practically anything. Positive bias underpins an optimistic personality, one that views events in their most favorable light. Negative bias is at play in many of the world’s –isms: sexism, racism, elitism, ageism, and so forth.

Confirmation bias is a special type of mental favoritism, defined as the tendency to interpret new information in a way that confirms an existing belief. From an evolutionary standpoint, confirmation bias helps the brain resolve complexity quickly. This cognitive sleight-of-hand is also what allows two people with opposing opinions on, say, climate change to view the same facts as supporting their own positions.

But that’s not quite enough to explain first impressions. You have to consider timing. The order in which we learn information exerts its own quiet influence: we give more weight to information learned earlier than to information learned later. That early information forms our baseline opinion, and we then evaluate later information against it. You can see this tendency at work when you shrug off a bad habit in an old friend that would drive you crazy in anyone else.

Taken together, these two mental quirks—confirmation bias and the tendency to prefer what we learn first—explain why first impressions matter so much. The first things you learn about a person anchor your opinion of them, and then you’ll tend to interpret everything you learn later as supporting your original opinion. 

You can dive deeper into the science behind confirmation bias in Entefy’s article “The hazards of confirmation bias in life and work.”


Scenes from the birth of artificial intelligence

Dr. Rudd Canaday is Entefy’s Software Architecture Fellow. He is a co-inventor of the UNIX operating system and was a graduate student in MIT’s pioneering artificial intelligence lab, the forerunner of today’s Computer Science & Artificial Intelligence Lab. Rudd shares some of his early AI experiences below.

After graduating cum laude in Physics from Harvard University, I went on to MIT for my graduate work. I started there in 1959, spending the first year and a half completing coursework for my Ph.D. qualifying exams. With that milestone behind me, it was time to choose my Master’s thesis topic. It was then that I decided to focus my thesis on artificial intelligence. I didn’t know then that the decision would place me right in the thick of historic advancements in computer science.

In 1961, artificial intelligence was just coming into its own, and MIT was the leader. Two men, Marvin Minsky and John McCarthy, both born in 1927, founded MIT’s artificial intelligence laboratory (the forerunner of today’s Computer Science and Artificial Intelligence Laboratory) in the same year I entered MIT. These men were pioneers in the field and today are acknowledged as two of the founding fathers of artificial intelligence.

Minsky was a scientist, inventor, and author who co-wrote the book Perceptrons, a foundational work in the analysis of artificial neural networks. He won the Turing Award in 1969 and was inducted into the IEEE Intelligent Systems’ AI Hall of Fame in 2011. Minsky remained at MIT until his death in 2016.

McCarthy coined the term “artificial intelligence” in 1955 and organized the groundbreaking Dartmouth Conference in 1956 that launched AI as a field. In 1958 McCarthy invented LISP, the programming language that soon became the go-to language for AI applications. McCarthy left MIT for Stanford, where he founded Stanford’s Artificial Intelligence Laboratory in 1963. He remained at Stanford until his death in 2011.

Back in the day, MIT’s CS & AI Lab was a wild place. Many of the AI graduate students had their desks in the same room, and it seemed to me noisy and chaotic. Darts were always being thrown (at a dartboard), as were wadded-up pieces of paper (at each other), all amidst lively discussions and arguments about machine intelligence. I don’t know how anyone got anything done in that atmosphere, but many of the earliest groundbreaking advances in AI happened there.

In 1950, the British mathematician and computer scientist Alan Turing proposed a test, now called the “Turing test,” to determine whether a machine is intelligent. In the Turing test, you sit at a teletypewriter and converse with a person—or a machine—out of sight at another teletypewriter. If the machine can fool you into thinking it is a person, then the machine is intelligent. Unfortunately, Turing introduced the idea by describing a party game in which you try to determine whether the unseen person is a man or a woman. That framing complicated his explanation by introducing the notion of gender, which for a while obscured the simplicity of his test.

The common belief in the CS & AI Lab at that time was that we would achieve true machine intelligence, a machine that could pass the Turing test, probably within five years, certainly within ten years. Many others believed it also. 

There were early signs that we were on the right path. At MIT between 1964 and 1966, Joseph Weizenbaum wrote a program called ELIZA to analyze natural English sentences. One of the scripts Weizenbaum wrote for ELIZA, named DOCTOR, simulated a Rogerian psychotherapist, a style of therapist who typically works with patients through a series of questions. ELIZA was not at all intelligent, as Weizenbaum was focused only on analyzing English sentences. However, many people, including many psychotherapists, focused more on ELIZA’s potential than on its very limited capabilities. They thought machines like it could one day revolutionize the field of psychotherapy.

For AI history enthusiasts, versions of ELIZA can still be found on the Internet. When it does not understand your input, it typically replies with something like “Tell me more about your father.” Given the advances in cognitive AI since, it’s hard to believe that anyone could have considered it intelligent.
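To appreciate just how little machinery ELIZA needed, here is a minimal sketch of ELIZA-style keyword matching in Python. The rules, responses, and example inputs below are invented for illustration; they are not Weizenbaum’s original DOCTOR script, and the real ELIZA also reflected pronouns (“my” became “your”) before answering.

```python
import random
import re

# Illustrative ELIZA-style rules: a keyword pattern plus canned response
# templates. "{0}" is filled with the text captured after the keyword.
RULES = [
    (re.compile(r"\bi need (.*)", re.IGNORECASE),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bi am (.*)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
]

# Stock prompts used when no keyword matches -- the source of replies
# like "Tell me more about your father."
FALLBACKS = [
    "Please go on.",
    "Tell me more about your father.",
    "How does that make you feel?",
]

def respond(user_input: str) -> str:
    """Return an ELIZA-style reply by simple pattern matching."""
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            # Unlike the real ELIZA, this sketch does not reflect
            # pronouns, so "my exams" is echoed back unchanged.
            return random.choice(templates).format(*match.groups())
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(respond("I am worried about my exams"))  # keyword match
    print(respond("The weather is nice today"))    # fallback prompt
```

Everything the program “says” is a template; when no keyword fires, it falls back on stock prompts, which is exactly where replies like “Tell me more about your father” come from.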

I’ve often thought about why machine intelligence, which we have yet to achieve, is so much harder than we thought 50 years ago. I think that a central issue is worldview, which seems to drive so much of human communication and understanding. Since we share a vast amount of common information with other people, we communicate in shorthand, taking for granted all of that commonality. That can explain why communicating across cultures is sometimes difficult. 

Today, developments in artificial intelligence are happening very fast. IBM’s Watson, the AI system that won Jeopardy! against two human champions in 2011, was interesting in large part because of its grasp of colloquial English: Jeopardy! clues are rich in puns, red herrings, and wordplay.

With advanced resources (from accelerated computation to efficient architectures to big datasets) now available to many AI systems in development, quality machine intelligence—that once upon a time at MIT we were sure was just 5 or 10 years away—may finally be on the horizon.


Your sex predicts whether you say “um” or “uh”

“So, um, let’s get started.” 

We all do it. You’re talking away and, for a brief moment, a fact eludes you. Or you lose track of exactly what to say next. Or you’re nervous and just plain blank out for a moment. To buy yourself a little time, you throw out an “uh” or “um” or “like” or “sooooo.” There is a long list of verbal fillers like this. In fact, linguists study so-called “disfluency” and have found that as many as 6% of the words we use may be fillers that serve more as punctuation than as carriers of meaning.

Indeed, the study of these fillers is quite advanced. Wikipedia maintains a list of fillers in different languages that includes hundreds of examples. One academic study of 14,000 conversations in English found that men are more likely to use “uh” while women favor “um.” That usage also changes over time, so regardless of your sex you can look forward to saying more “uh” and less “um” as you get older.

There are some basic tips for stopping the overuse of verbal fillers like “like”: relax, take a deep breath, don’t let your thoughts race too far ahead of your words, and substitute silence for, uh, verbal fillers.

Entefy’s enFacts are illuminating nuggets of information about the intersection of communications, artificial intelligence, security and cyber privacy, and the Internet of Things. Have an idea for an enFact? We would love to hear from you.


The brain’s complex relationship with social media [VIDEO]

We’ve all experienced the itch that strikes during those in-between moments when you have spare time on your hands. Without thinking, you’re reaching for your mobile device to flick through social media feeds in search of something new. Turns out, what’s happening in your brain in these moments is complex—and fascinating.

In this video, we look at the brain’s love/hate relationship with social media, covering everything from memory formation to dopamine cycles to tips on sustaining deep focus.

You can read more about the neuroscience behind social media here.


One week is all it takes to change your career forever

A friend of Entefy went off the grid for a “Think Week,” a distraction-free week of personal and professional goal setting. She shared her insights into how to pull off a successful and productive Think Week. 

Earlier this year, Entefy examined the myth of busyness that consumes many people’s lives. Our research revealed that most of us have more time in our days than we realize, but we’re often too “busy” to use those hours effectively. I’m guilty of exactly that. Between professional responsibilities, social plans, and life’s logistics, I often felt that my whole day was spoken for before I got out of bed in the morning. 

As I reflected on my constant busyness, however, I noticed patterns that contributed to the persistent feelings of exhaustion and frustration that came from my seemingly over-packed schedule. My days were peppered with interruptions – text conversations, social media checks, phone calls. Most of these were non-urgent yet I’d respond instantly, which meant it took me at least twice as long as necessary to complete any task. One distraction led to another and then another. I realized I could reclaim a good hour or two of my day, at minimum, if I learned to better manage my habits. 

But my brain craved the dopamine hits that come from checking social media or email every five minutes. Even when I swore I was going to focus, I couldn’t go more than 30 minutes without being seized by the impulse to do something more immediately gratifying. My life stayed busy – too busy. I needed to hit the reset button on how I approached my days and my priorities. I decided to take a Think Week.  

A week for thinking deeply 

In his book Deep Work, Cal Newport describes the famous example of J.K. Rowling checking herself into The Balmoral Hotel to escape the distractions of her daily life as she finished writing Harry Potter and the Deathly Hallows. But you don’t have to be an iconic author to benefit from time and space to reflect—as I proved when I took a Think Week this summer. 

My goal was to reflect productively on my life and career. I hoped that taking time out to consider where I’m at, where I’ve been, and where I’m headed would inspire insights and motivations that had gone dormant in the crush of everyday responsibilities. 

Think Week was an opportunity to disengage from my daily routine, and I found the experience rejuvenating. While it didn’t cure me of all my bad habits, it did help me identify and curb them. Remembering what it was like to not always be on my phone or consuming digital content gave me perspective on the appropriate place of email, social media, and the Internet in my life. 

If you’re seeking a way out of the busyness trap, a Think Week can get you started down that path. Here are five steps you can follow to make the most of the experience:  

1. Schedule your Think Week in advance

Marking your Think Week dates in your calendar will solidify its importance in your mind. It will also give you an opportunity to prepare your family, friends, and colleagues for your week of disconnection. A month out from your start date, create a list of everything you need to get done beforehand. This includes things like work projects, social plans you’ve yet to make, and visits to your parents. 

Find out what your boss or colleagues need from you before you go out of the office and be mindful of how many commitments you take on during these four weeks. Think Week will approach faster than you realize and you don’t want to be pulling all-nighters just to hit your deadlines. The transition will go more smoothly if you’re transparent with everyone about what they can expect during this time and honor your existing commitments before you go off the grid. 

2. Set boundaries 

It’s not enough to say you’re disengaging from your typical patterns for the week. Habits are tough to break, and unless you set boundaries, you’ll find yourself fielding calls and texts while you’re supposed to be in contemplation.  

Depending on your circumstances, you may want to allow contact from a limited number of people. But you should be clear with them and with yourself about what those allowed circumstances are. I told a few family members and close friends that while I wouldn’t be checking texts, email, or social media, I would leave my phone on so they could call me in an emergency. Then I blocked all social media, email, and text apps on my smartphone so I wouldn’t be tempted to check them throughout the week. 

Decide how much connectivity you’ll allow yourself on other devices as well. I opted not to watch TV during my Think Week because I feared that I would fall down the slippery slope into a Netflix binge rather than reading or brainstorming. However, I allowed myself Internet access and watched YouTube videos that I deemed educational or relevant to the topics I was pursuing during the week. Laying out the guidelines in advance gave the week structure, and it helped me overcome my impulses to indulge shallow pursuits while I was supposed to be contemplating the big questions in my life.  

3. Establish clear goals 

Know why you’re doing a Think Week and what you hope to gain from it. You don’t need to set grueling or unrealistic expectations for yourself, such as writing the first draft of a novel or inventing a life-changing product. In fact, it’s probably better if you avoid pursuing such specific outcomes. Paint your goals with broad strokes so that your Think Week has a guiding purpose, but leave room for meandering trains of thought and brainstorming sessions that might lead you to new goals entirely. 

I identified a few key areas in which I was seeking clarity, including long-term career goals and lifestyle changes. Keeping these targets in mind brought focus to my days, so I didn’t wake up each morning feeling aimless. I didn’t follow a set routine during the week, but I knew my purpose each day. By the time Think Week ended, I felt confident that I could put my newly clarified goals into action. 

4. Be spontaneous 

The beauty of Think Week is that you’re out of your daily routine, which can give rise to impulses and interests you’d usually ignore. During my Think Week, I spent large portions of my days reading on the couch. On writing-intensive days, I liked to spend a few hours at a local coffee shop because the change of scenery helped me think. Indulging whatever setting felt right in the moment helped me gain momentum in my thinking and planning.

Make a list of places that you enjoy visiting, and keep it on hand during Think Week. If you’re feeling stuck or restless in your home, go for a walk or wander around a bookstore. Perhaps you do your best thinking in the woods. Allow yourself the luxury of visiting places that inspire you, because spending time in these environments could spark connections that accelerate your thinking. 

5. Plan how to build Think Week findings into your life 

A day or two before Think Week ends, block off a few hours to plan your transition back to your normal schedule. By this point, you’ll have some idea of changes you want to make or goals you want to achieve. Without a plan for how you’ll work these into your routine, however, your Think Week energy will quickly dissipate and little will change. 

Perhaps you can scale back on some social commitments to make more time for the extracurricular work project you want to tackle. Maybe you need to delete certain apps from your smartphone until you’re in the habit of only checking them at certain times, allowing you to cultivate better discipline and greater productivity. Whatever your strategy, write it down and create reminders you can post around your home or workspace to keep your priorities top of mind. 

By the end of my Think Week retreat, I had reconnected with my big picture goals, what I want to achieve in my career, and what matters to me on a personal level. In the noise of constant connectedness, I had let passion projects fall by the wayside and become distant from people I value. After seven days of disconnection, I felt renewed in my priorities and had a clear path toward honoring them. I also had a game plan for having more productive, fulfilling days, which began with strict limitations on how and when I use social media and email. If you never break free of your daily routine, you might be aware that you need a change, but that change will never happen. You need to step outside the stream of input, responsibilities, and stimulation to learn how to flow more effectively within it. 

If the idea of a Think Week appeals to you but you can’t swing a full seven days, try a Think Weekend. Spend Saturday and Sunday offline, do some reading, and give yourself a chance to explore the thoughts and questions that have been lingering at the back of your mind. The act of disconnecting and devoting your attention to deep topics will feel revolutionary, and it could well change the way you structure your life. 


New traditions in traditional industries: 9 disruptive artificial intelligence technologies happening now

Entefy recently covered AI disruptions already underway in several industries, like utilities and manufacturing. These are industries where you would not normally expect to find advanced AI systems deployed, much less having a disruptive impact. But as the capabilities of algorithmic systems develop rapidly, the surprise these days is finding an industry not experiencing the impact of artificial intelligence.

The story is no different in traditional industries like agriculture and banking. AI is hard at work untangling difficult, longstanding challenges; automating labor- and time-intensive work; and creating entirely new products and markets.

Here are some of the most impactful uses of artificial intelligence technologies in traditional industries:

1. Agriculture 

Automation systems are already relieving humans of dangerous farming jobs like picking lettuce, which can expose workers to potentially toxic chemicals. AI may also hold a key to solving the global food crisis, using automated farming to improve crop yields through targeted strategies. Drones can already collect data from vast swaths of farmland to identify which areas are thriving and which are at risk of failing. Some researchers are even attempting to teach drones to cooperate with one another, converging on areas with significant weed problems so they can unleash pesticides on the afflicted sections.

2. Banking 

Advances in natural language processing (NLP) have made financial industry self-service systems capable of increasingly complex functions, such as onboarding new customers and assisting them with major loan decisions. Also, machine learning and optical character recognition are further simplifying banking by allowing people to submit financial documents through their smartphones. A customer can snap photos of the documents, and the system will automatically upload the images and extract the relevant information. 
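As a rough illustration of that capture-and-extract flow, here is a minimal sketch in Python using the open-source Tesseract engine via the pytesseract library. The “Account No” label, the digit format, and the file name are hypothetical; a production banking system would use trained document-understanding models rather than a single regular expression.

```python
import re

from PIL import Image      # pip install pillow
import pytesseract         # pip install pytesseract (requires the Tesseract binary)

def extract_account_number(photo_path: str):
    """OCR a snapped photo of a financial document and pull out one field.

    The "Account No" label and digit format are illustrative assumptions.
    """
    # Recognize the text in the customer's photo.
    text = pytesseract.image_to_string(Image.open(photo_path))

    # Look for a line like "Account No: 12345678" in the recognized text.
    match = re.search(r"Account\s*No\.?\s*:?\s*(\d{6,12})", text, re.IGNORECASE)
    return match.group(1) if match else None

if __name__ == "__main__":
    # Hypothetical file name for a photo taken on a customer's smartphone.
    print(extract_account_number("statement_photo.jpg"))
```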

3. Education 

Blended learning, in which teachers use technology to enhance traditional classroom environments, is gaining prominence in American schools, thanks in large part to AI. Technologies such as machine learning and NLP create the potential for AI-based lifelong learning companions. These programs would tailor their content to individual students based on the subject areas a child struggles with and which lessons are most effective. AI already serves as a kind of digital teaching assistant, taking over tasks such as grading homework and papers so that teachers can focus more deeply on lesson planning and student engagement. 

4. Food 

A powerhouse combination of machine learning and DNA sequencing could lead to food products that help people manage chronic disease. We’re not talking kale and blueberries here, either – these superfoods would be developed around specific peptides and how they impact diseases such as high blood pressure and type 2 diabetes. The speed of AI-powered analysis could advance a field of study that has long grappled with slow results and extremely high costs, and could lead to breakthroughs in nutrition.

5. Government 

AI is augmenting government work across the spectrum, from data entry to disease outbreak responses. Cognitive applications based on neural networks now analyze data anomalies that impact terrorist threat levels or signal shifts in the markets, events that require urgent government attention. Real-time tracking is also helping the government improve medical outcomes by identifying clusters of serious disease outbreaks. The military is developing technologies that can assess soldiers’ wounds based on data collected through wearable technology, enabling medics to prioritize treatments and treat urgent cases more swiftly. In more ordinary cases, sensors on street lights collect real-time data about traffic and maintenance needs and give citizens a heads-up when their parking meters are about to expire.

6. Healthcare 

Machine learning is helping doctors make faster, more precise diagnoses by studying medical records and contrasting images of healthy versus diseased organs. This technology could help address the global caregiver shortfall by improving medical diagnosis and care. In 2015 alone, China’s 80,000 radiologists saw 700,000 new cases of lung cancer. Fortunately, AI programs that can identify lesions and other disease markers are helping radiologists and doctors make earlier diagnoses and therefore prescribe treatment sooner.

7. Law

The use of AI in the legal discovery process is becoming more mainstream. Technology is expanding into other areas as well, including predictive analysis and contract reviews. The former could prove especially valuable to companies as they determine whether to go to trial and assess their risks. Knowing the likely outcome of a case could save significant resources and shape better policies down the road. 

Although lawyers must be involved in contract reviews, legal industry machine learning platforms can decrease the time lawyers spend on those tasks by 20% to 60%, allowing them to focus on high-level tasks only humans can perform. Litigation strategist James Yoon said clients are still willing to pay a premium for complex, high-stakes legal services. “For the time being, experience like mine is something people are willing to pay for. What clients don’t want to pay for is any routine work.” 

8. Nonprofit

AI is literally saving lives in the nonprofit world. One suicide prevention hotline uses machine learning to identify the phrases most often associated with emergency cases so it can prioritize those messages and respond faster to people in need. Another nonprofit, this one aimed at improving students’ writing, uses natural language processing to address users’ problems with sentence fragmentation. The organization had its system analyze 100,000 grammatically correct sentences, then used an NLP platform to break them down into fragments, giving the model labeled examples of both complete and incomplete sentences. Once the program learned to distinguish sentence fragments from complete thoughts, it achieved an 84% accuracy rate at picking out fragments in students’ writing.
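To make that training pipeline concrete, here is a minimal sketch in Python of how such a fragment classifier might be trained with scikit-learn. The toy dataset and bag-of-words features are illustrative assumptions standing in for the nonprofit’s 100,000-sentence corpus and its NLP platform.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data: 1 = complete sentence, 0 = fragment. A real system
# would start from a large corpus of grammatically correct sentences and
# programmatically break them down to generate the fragment examples.
texts = [
    "The dog chased the ball across the yard.",
    "She finished her homework before dinner.",
    "We walked to the store because it was sunny.",
    "Running through the park on a sunny day.",
    "Because the bus was late.",
    "The tall man in the blue coat.",
]
labels = [1, 1, 1, 0, 0, 0]

# Word and word-pair counts feeding a simple linear classifier.
model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(texts, labels)

# Score a new piece of student writing. With a toy corpus this size the
# prediction is only illustrative; the nonprofit's system reached 84%
# accuracy after training on far more data.
print(model.predict(["Although he tried very hard."]))
```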

9. Insurance 

How much privacy would you trade for cheaper insurance? Artificial intelligence is powered by data. And when it comes to data, often more is better. One distinctive aspect of the insurance industry’s adoption of AI is how these companies intend to collect their data. Insurers are turning to sensors that collect data directly from individuals, including technologies like in-home monitors, automobile transponders, and wearables. These new data sources open the doors to new products and pricing models, but whenever data collection intersects with a real person’s life, privacy questions emerge.

As you’ve seen from the variety and scope of this list, AI is being used to tackle challenges large and small—creating new opportunities for innovation at companies around the world.


What’s more distracting while driving: texting or calling?

You’ve probably seen or heard a public service announcement about how dangerous it is to text while driving. Looking at your smartphone to tap out a message is a sustained distraction from what’s happening on the road. We’ve all been guilty once or twice of nearly walking into someone while texting on foot; doing the same thing in a car at highway speeds is outright dangerous.

Data points to another dangerous smartphone-related behavior to watch out for while driving. According to a study of 3.1 million drivers by the U.S. National Safety Council, 88% of drivers use their smartphones for an average of 3.5 minutes per hour of driving. The data doesn’t show that talking on the phone is itself dangerous—but making or answering a call is, because doing so requires the driver to look away from the road. And just 2 seconds of distraction increases the risk of an accident by up to 24x.

This is a strong argument in favor of next-generation conversational interfaces for operating devices. But until those technologies mature, be careful starting and ending calls while driving. Those seconds matter.


The soaring size and energy need of data centers [VIDEO]

Data centers are those giant facilities that house the servers that store the world’s digital bits. As more and more people come online globally, these facilities are expanding in number and size. That expansion has all sorts of consequences, from insatiable energy consumption to a surprising link to how airliners are built.

In this video enFact, we take a look at how the world’s thirst for data is shaping data centers.

Read the original version of this enFact here.

Entefy’s enFacts are illuminating nuggets of information about the intersection of communications, artificial intelligence, security and cyber privacy, and the Internet of Things. Have an idea for an enFact? We would love to hear from you. 


The Paradox of Prosperity (Part 4)

Entefy’s research into the complexity of modern life concludes below. In Part 1: The bygone Golden Age, we outline the paradox of prosperity, the observation that despite widespread evidence of the world’s progress, our individual experience of life is often something quite different. In Part 2: Defining modern complexity, we analyze the aspects of life today that create the experience of complexity and consequentiality. In Part 3: The general erosion of confidence, we look at the evolution of our collective trust in social institutions. In Part 4: Family, friends, and community, we analyze the structure of social circles and changes to income and education. The report concludes by examining how digital technologies can contribute to meaning and fulfillment in modern life.

Part 4: Family, friends, and community

Picture two friends. They share similar tastes and preferences, world views, and life experiences. Individually, each one shares similar bonds with other people: friends, family, colleagues, and so on. And each of those persons has bonds of their own. These bonds extend outward to include practically everyone in the world. Each person is unique, but when these shared viewpoints are extended outward far enough, you have what we call society.

Today, after nearly 50 years of rapid social evolution, the U.S. has far more variety, diversity, and complexity than ever before. Here is another face of the complexity we’ve been describing: the changes to traditional family structures, friend networks, and communities. The rise of single parenthood and the decline of the middle class are two aspects of these changes.

The U.S. of the 1960s had a more consistent family and friendship structure. For example, 87% of families included two parents, according to Pew Research Center. By 2014, however, that figure had declined to 61%. Similarly, in 1985, 78% of survey participants reported having more than one friend; today, that figure is much lower, at only 53%.

The share of households consisting of a single person has risen from 13% in 1960 to 28% in 2016. The share of people with many friends is small (15%) and is down 72% from where it stood in 1985. While post-Internet communication technologies created the capacity for broad virtual relationships, the data nonetheless tells a very different story of personal connection, or the lack of it. When you factor in declining labor force participation, more and more people are essentially disengaged, not just from friends and family but from meaningful interaction with others.

Next is money. While U.S. real median household income is high by global standards, it has also been relatively stagnant for the past twenty years. Importantly, this stagnation comes after fully 50 years of steady increases. The reliability of these income increases and the fact that they were spread quite widely across social tiers was a major factor underpinning the “American Dream” itself. 

The importance of rising income is not solely monetary. Higher income creates options for the wage earner and helps buffer households from unexpected shocks and setbacks. In contrast, today a static income, even a high one, represents a decline in expectations alongside a practical reduction in the capacity to absorb unexpected setbacks.

There are complex aspects to educational attainment as well. The percentage of the population with a college degree has been rising steadily since 1940 and is higher than ever today, at 32.5%. While a more educated workforce is a good thing from a societal perspective, from an individual perspective it simply means more competition. When 5% of the population had a college degree, a degree represented a near-guarantee of high-paying, opportunity-filled work. At 32.5%, it is a less valuable credential and less of a guarantee of future prosperity.

As we saw with the general erosion of confidence in social institutions, the true impact of these changes is best understood with some distance. Single data points do not tell the whole story. Taken together, rapid changes to the nature of our connections to friends and families have created for many people a sense of…something. Nostalgia for a better yesterday. Confusion and distrust towards today. Or in some cases the widespread anger that propelled an outsider to the Presidency in the 2016 election.

Of the facets of complexity that we have covered in this research, the changes to our social fabric are the most personally felt. But they follow the same pattern we’ve described in other areas. Rapid change introduces complexity to our lives, and that complexity serves as a headwind to our efforts to increase our productivity and, in turn, our prosperity.

But as we stated at the outset, our goal is not simply to quantify the complexity in our lives, but to discover a way past the paradox of prosperity toward better living. That path starts with awareness and knowledge of how to deal with the tectonic shifts taking place in technology.

Will tomorrow’s technologies transcend the paradox?

The thing with complexity is that it’s complicated. Complexity carries a cognitive burden that we experience whenever we observe the rapidly evolving world. The faster the world changes in terms of social conventions, technology, and commercial needs, the more we have to prepare for change. Every iteration of change is both an opportunity to excel and a risk of failure. No wonder there is so much anxiety.

So what is to be done?

One solution to the paradox of prosperity is simply to accept a simpler life, even though that’s likely to entail lower productivity. The other is to master advanced technology to help bridge the gap between increased complexity and increased productivity.

Historically, innovations, in particular communication technologies that enable new forms of connection and collaboration (from telegraph to radio to the Internet), have correlated with dramatic growth in global GDP. The advent of the telephone, for instance, led to widespread behavior change: a certain portion of a person’s time was now dedicated to connecting on the phone with friends, family, and customers.

Today, the “phone” we carry represents a massive focus of our time, for not only basic communication but also reading, information retrieval, photography, games, music, and so on. As with so many aspects of modern life, today’s technologies also represent a massive increase in complexity over what existed only a few years ago. First there is disruption, then widespread benefit. 

What is interesting is that even though technology can cause complexity, it is also our best resource for simplification. We are only decades into the Internet era, while the impacts of artificial intelligence and robotics have barely been felt. Developments in computing, smart machines, and automation carry tremendous potential to directly improve people’s lives. 

In recent years, technology has increased personal capabilities and productivity even as it has increased complexity. To take just one example, getting all our devices to seamlessly talk and sync with one another is a daunting task for most people. But there is a predictable cycle here. Today we’re in the early stage of the adoption of a long list of new technologies—from smartphones to artificial intelligence to social media to robotics. And, consistent with past adoption cycles, we’re in the difficult, sometimes confusing, disruptive stage. New capabilities carry a cost in complexity as we figure out how to fit them into our lives.

Current-generation technologies are powerful—powerfully capable and powerfully distracting—and in need of refinement. The good news is that we are at the cusp of a shift where that very technological capability starts to bring greater simplicity to our lives, as a new generation of technologies hides technical complexity away behind screens and shells, leaving us with much simpler interfaces that make getting things accomplished faster and easier.

All of this relates directly to the paradox of prosperity. The generations-long changes we have described, the day-to-day complexity they have created, are at a tipping point. Much of this complexity can be untangled with advances in a new generation of technologies. When that refinement comes, all of the time we spend every day managing devices, fiddling with settings, troubleshooting problems, and so on will be freed up to spend however we want. 

Advanced technology that insulates us from complexity will free up that time even as it helps us make better choices, ultimately empowering people to live and work better. And there we see the narrow path out of the paradox of prosperity: advances in artificial intelligence, robotics, biotechnology, and other technical areas that support a streamlined and simplified life, a life lived with meaning and purpose, that is savored not rushed, deep not shallow. Where we can be the best versions of ourselves.

The Paradox of Prosperity (Part 3)

Entefy’s research into the complexity of modern life continues below. In Part 1: The bygone Golden Age, we outline the paradox of prosperity, the observation that despite widespread evidence of the world’s progress, our individual experience of life is often something quite different. In Part 2: Defining modern complexity, we analyze the aspects of life today that create the experience of complexity and consequentiality. In Part 3: The general erosion of confidence, we look at the evolution of our collective trust in social institutions. In Part 4: Family, friends, and community, we analyze the structure of social circles and changes to income and education. The report concludes by examining how digital technologies can contribute to meaning and fulfillment in modern life.

Part 3: The general erosion of confidence

The importance of confidence and trust at a cultural or societal level has become the subject of intense study among economists and sociologists. It has become increasingly clear that societies and cultures with high levels of confidence in their institutions are also societies and cultures with high levels of productivity and development.

Among the mechanisms by which confidence contributes to economic, societal, and personal development is efficiency. When there is high confidence, a handshake can take the place of a contract, saving time and money. The lower the level of confidence, the more precautions have to be taken, the more you have to focus on managing risk.  

Gallup has been monitoring trust and confidence levels in institutions for decades. They cast their survey questions in terms of confidence: “Please tell me how much confidence you, yourself, have in each one — a great deal, quite a lot, some, or very little?” They started asking these questions in 1973 and have added more institutions over time. The results are insightful and tell us a great deal about how complexity has impacted our day-to-day lives.

There are some bright spots. Today, we are more confident in Small Business, the Police, the Military, and the Scientific Community than when polling began decades ago. Confidence in Small Business has always been high and remains so, while the Military scores the highest confidence of all at 73%. The Scientific Community, on the other hand, elicits only a moderate degree of confidence, though confidence in it has risen 11% since 1973.

Confidence is eroding in most of the institutions covered in the survey. These include institutions that are part of most people’s everyday lives like Banks, Television News, and the Medical System, as well as government and public services like Congress, the Presidency, and Public Schools. 

Today, Congress earns the lowest confidence score at just 9%, a 79% erosion from when the survey began in 1973. Television News is not faring much better, with confidence falling from 46% in 1993 to 21% in 2016.

In 1993, new categories such as the Medical System and the Criminal Justice System were added to the survey. The Medical System received an 80% confidence vote, the highest of any category in the survey at the time. That number has since declined by slightly more than half, to just 39%. In the same year, the Criminal Justice System scored the lowest at only 17%, though it has seen modest increases since, reaching 23% in 2016.

To sum it all up in one stat: satisfaction with the way things are going in the U.S. stands at only 27%, nearly half of what it was in 1989.

In sum, we know that trust and confidence are critical elements in a healthy society and economy and that high levels of trust and confidence contribute to productivity and overall prosperity. As important as these attributes are, the past several decades have seen overall declines in confidence in most of our institutions. 

Which brings us back to the idea of complexity. The everyday experience of this data is troubling. Day to day, we have little confidence in the veracity of a speech in Congress, or the fairness of a judicial ruling, or the motivations of the bank that holds our savings. So our experiences of these interactions are strained and overly complicated. 

Part 4 of this report concludes Entefy’s analysis with a look at the evolution of friends, families, and communities; and a discussion of how digital technology can transcend the paradox of prosperity.