Too many cooks in your kitchen? 5 ideas for better collaboration

Have you ever sat in an overcrowded conference room and had that old expression, “Too many cooks in the kitchen,” come to mind? Modern business is defined by increasing amounts of collaboration, a reality encouraged by open floorplan offices, mixed teams of onsite and remote workers, and participation in multiple projects with multiple teams, to name a few.

Yet in many ways, we each have a different view of just what “collaboration” is. And not always a positive view. Collaboration researchers noted that “Teamwork all too often feels inefficient (search and coordination costs eat up time), risky (can I trust others to deliver for my client?), low value (our own area of expertise always seems most critical), and political (a sneaky way of self-promoting to other areas of one’s firm).”

Below we’re sharing research and advice for collaborating in general, and for setting up effective meetings specifically: 

  1. Keep the group small. While Dunbar estimated that the limit for stable social relationships we can maintain is 150, the optimal team size is much smaller. There’s no one magic number, but research into optimizing small group size suggests that collaborative groups work best when they have around 5 members. Adding members beyond 5 can improve output, but at the cost of greater management overhead. Another study of team size found that smaller groups “participated more actively on their team, were more committed to their team, were more aware of the goals of the team, had greater awareness of other team members, and were in teams with higher levels of rapport.”
  2. Make sure goals are clear. Limiting the number of people in the group forces us to choose the best people for the job, and choosing the best people means considering exactly what the goals and objectives are. Having people sit in on meetings that are irrelevant to them or outside of their skillset wastes time; letting people know what the point of the meeting is and what their role is within it helps keep the group on track and focused. 
  3. Don’t forget alone time. When a group’s goals are clear and members’ roles are spelled out, individuals can branch off for independent work. Schedule dedicated time together and save group communication for those sessions—people are less likely to enter a state of flow if they’re interrupted too often. 
  4. Ideas first, critiques later. To reduce the likelihood of people focusing on the first piece of information presented or feeding into the predispositions of others, have people form their own hypothesis and ideas before any sharing occurs. During meetings, have people write things down as opposed to calling out or raising hands, so that they won’t be influenced by what others do and say. 
  5. Challenging work encourages flow. Professionals generally work better when their goals are just within their abilities. Like Goldilocks: not too hard, not too easy. Walking this fine line also allows us to enter the highly creative, productive state of mind known as “flow.” Flow is not solely an individual phenomenon; it is possible in groups as well. The state of flow is characterized by intense focus and the blurring of time, and it occurs when a challenge is not so easy that it becomes boring and not so difficult that it feels impossible. This is more easily achieved when the right members are paired with the right projects, and when the group is small and cohesive enough to trade ideas and provide feedback while staying focused and on track. 

Keep these 5 ideas in mind before scheduling your next meeting and watch your team’s productivity soar.

Collected, bundled, and sold: your sensitive private data [SLIDES]

Why does it matter that so many apps, web services, and devices collect data about practically every aspect of your life? The answer is partly because it often happens without your knowledge or permission. And partly because once data is created, it can live on indefinitely—and who’s to say where the data that you consider sensitive might end up? 

You can read an in-depth look at these data trackers in Entefy’s article, Collected, bundled, and sold: your sensitive private data.

Thanks, with a heaping side of giving

Surrounded by friends and family, delectable dinners, and companionable conversations, everyone at Entefy will be taking inventory of where we are today and where we hope to be tomorrow. 

In these moments of reflection, it’s all too easy to focus on the many challenges the world faces today. It can be much harder to recognize the many areas where things are going well. And the many signs of a better tomorrow just over the horizon. It is for these bright shimmering signs of hope and progress that we are thankful.

So much can be accomplished with more giving by more people and we believe it is giving that truly expresses our thanks. Companies giving back to the people and places that make them successful. Individuals giving a little bit of ground to ensure a fair compromise and a respectful listen. 

From everyone at Entefy, we’d like to extend to all our warmest wishes for a bountiful Thanksgiving season. 

The Entefyers

The digital chasm: Ageist design in technology is a problem for everyone

If you’re above the age of 25 and you use Snapchat, there’s a good chance a teenager taught you how to navigate the youth-centric disappearing photo app. Snapchat became wildly popular after its 2011 debut, especially among users on the younger end of the millennial spectrum. But the app confounded most older users, giving even tech-savvy 30-somethings a taste of what it’s like to use tech products that cater to youth. Snapchat is just one example of the tech industry’s disregard for the needs and preferences of many older users. And that disregard is a big problem for everyone.

We’re all familiar with the trope of parents and grandparents relying on their kids to teach them how to turn on their computers or connect their smartphones to Wi-Fi. But beneath the joke lies a more substantial issue. Technology is often designed for the young, by the young, and the rest of the population is left to play catch-up. 

That’s a problem, because as everyone becomes more reliant on technology for everything from buying groceries to accessing medical care, poor user experience design excludes people from important services. It’s also short-sighted from a business perspective, since a middle-aged executive likely has far more buying power than a smartphone-savvy teen. Yet many of our most commonly-used platforms seem to disregard usability factors for all but the youngest consumers. 

The digital chasm

Americans check their phones 8 billion times a day, and the average person will spend five years of their life on social media. Given all of that connectivity, you’d assume that the mobile apps that consume so much of our time are created with usability as the number one focus. Yet across many digital experiences, it’s clear that that’s not the case – especially if you’re older than 35.  

At first glance, the youth-focused nature of tech design makes some sense. Research indicates that 18-34-year-olds show the highest rates of app downloads, time spent in-app, and device usage overall. Because they’re connected to a wide range of peers through social networks, they’re more likely to learn about and embrace new apps and digital products faster than older consumers.  

While baby boomers and seniors are unlikely to outpace millennials on screen time (research shows that young users spend more time on their smartphones each day than they do with other people), smartphone and social media use is on the rise among older cohorts. One third of senior citizens use social media to get their news and entertainment, and 40% own smartphones. 

Yet many older adults feel excluded from current technology trends. For instance, 77% say they would need help setting up a tablet or smartphone. Once they get online, however, many are quite active digitally and feel positively about their web experiences. Improved usability could extend the benefits of the Internet and mobile devices to a much wider range of people.

Despite this growing adoption, many tech platforms appear to be designed with little regard for older users. A digital native might intuitively know how to navigate the overwhelming number of features on a given platform. But millions of smartphone and social users didn’t grow up with these technologies, and designers aren’t making it any easier for them to adapt. 

Ageist design on major platforms

Consider Facebook. When you open the page or app, you’re bombarded with activity from your friends and family. If you don’t get sucked into your constantly updating news feed, you may find yourself inundated with group and event listings, chat notifications, and friend recommendations for people you may or may not know. It’s easy to spend hours wading through all of the content and options. And many people do. The average Facebook user spends 50 minutes per day on the platform, nearly six hours per week. That’s 42.6 billion man hours per month given the platform’s 1.65 billion monthly active users.

But is the design user-friendly for all? Not so much. That’s why you see older users unwittingly posting personal messages as status updates or announcing to the world that they don’t know how to copy and paste. All of which indicates real usability problems – especially when you consider the privacy implications. 

Even people who have used Facebook since its launch in 2004 gripe about its confusing privacy settings. As users become more aware of how their social posts impact their employment prospects and even their data security, they want more control over who sees what and when. Facebook has responded to demands for increased privacy control, but usability challenges persist for users already overwhelmed by the interface or how to manage their accounts. 

But Facebook isn’t the only company that seems to take a less-than-inclusive approach to usability design. In recent years, critics have taken Apple to task for emphasizing aesthetics over functionality. Two of Apple’s earliest designers lambasted the company, which they say was once “a champion of the graphical user interface,” for abandoning “the fundamental principles of good design.” They say that in its quest for beautiful design, Apple created “obscure gestures that are beyond even the developer’s ability to remember.” And if a developer doesn’t find them intuitive, what chance does a late-adopter stand?

Then, of course, there’s Snapchat. One writer described his experience of attempting to use the app this way: “I end up flustered and sweating, haplessly punching runic symbols in a doomed bid to accomplish the basic task of viewing my friends’ messages before they expire. Snapchat, in short, makes me feel old.” Another put it even more succinctly, writing, “Snapchat has an age limit. But it isn’t set by asking you for your birthday. It is set through an interface that is so confusing you need to be young to get it.” 

Not every app is going to appeal to every age group, and there’s nothing inherently wrong with catering to young users. But shouldn’t interfaces prioritize universal usability? 

Inclusive design improves lives 

Although baby boomers and senior citizens are slower to adopt new tech than their younger counterparts, it’s often not for a lack of desire. Data shows that older people are increasingly embracing new technology, especially products that accommodate the challenges of aging, such as vision impairment and arthritis. However, users over the age of 60 are reluctant to purchase smartphones because the “smaller screens and complex menus” are more difficult for them to use. 

A move toward inclusive usability design has long-term benefits. Terry Bradwell, chief enterprise strategy and information officer at AARP, weighed in saying “As long as tech changes, there will always be a divide of some sort.” Most millennials grew up with constant connectedness, so while they’re not quite digital natives, they’re adapting OK – for the time being. But as we’ve witnessed in the past 20 years, technology is advancing at an unprecedented rate. Unless we begin prioritizing usability for all ages now, Snapchat will only be the tip of the iceberg in a generational technology divide. 

“Intrapreneurship” isn’t a typo. It’s your company’s best defense against competitive disruption.

The idea of intrapreneurship is attracting more and more attention from professionals and companies alike. And for good reason. Intrapreneurship represents a new way of thinking about innovation inside organizations.

If you haven’t encountered the term before, here’s how intrapreneurship relates to the more familiar entrepreneurship. Entrepreneurship encompasses all of the activities related to dreaming up and launching new business ventures. Intrapreneurship, by comparison, represents those same activities taking place inside a company, focusing on areas like identifying new market opportunities or improving policies and processes. In short, intrapreneurship is innovation and change-making focused on improving the competitive position of an existing company.

Below we’re sharing 11 core ideas about intrapreneurship and intrapreneurs. This list is a mix of advice for professionals interested in the topic and companies looking to create an effective innovation culture.

  1. Go deep, not broad. The first thing to know about intrapreneurship is that employees should focus on specific challenges their company already faces. So be hyper-focused when pitching your manager. Explain the issue you’ve identified, how you plan to solve it, the impact your project will have, and the types of resources you need to get it done. Although intrapreneurship endeavors can be great opportunities for cultivating new skills, make sure you have enough baseline know-how to see the project through. Alternatively, you can showcase your leadership instincts by assembling an informal team of colleagues who possess the complementary skills the solution requires. 
  2. Drive productivity. Just as entrepreneurs require room to experiment, intrapreneurs need independence to thoroughly investigate the problem they are trying to solve. The payoff is that, “Intrapreneurs take risks and find more effective ways to accomplish tasks. An intrapreneur, in the most basic sense, is a skilled problem solver that can take on important tasks within a company.” 
  3. Earn buy-in before you present your idea. If you already have a track record of self-directed success, your bosses may give you some leeway. But if you’re new to intrapreneurship and want to make a good impression, find a champion for your idea. Figure out who stands to benefit most from your solution, and get their feedback before running it through the official channels. Having internal support before you’ve even pitched the concept will give you credibility with key decisionmakers.
  4. Own innovation—or else. At the core, intrapreneurship is about innovation: “The means by which large, mature corporates can develop and harness the commercial energy that will grow the business in a constantly changing and fiercely competitive environment.” With the speed of business today, even long-established market leaders face constant threats of competitive disruption. By fostering innovation, companies can stay one step ahead of that hot startup gunning for them.
  5. Leverage the millennial spirit. It’s often said that millennials are natural entrepreneurs, flush with skills like leadership, innovative thinking, and a bias for action. Those talents are prerequisites for intrapreneurial roles inside a company, and millennial professionals can contribute original thinking to innovation teams.
  6. Let mavericks thrive. Companies with more conservative cultures may have difficulty with the concept, but those are also the companies likely in need of radical thinking to identify emerging competitive threats, generate new product ideas, and demonstrate the effectiveness of new ways of thinking. 
  7. Recognize your intrapreneurs or lose them. Data shows that 70% of successful entrepreneurs dreamed up their startup while working at a previous employer. To build your innovation engine, your company needs to find ways to energize and incentivize your employees to be more intrapreneurial, and then capture and implement their best ideas. Your best intrapreneurs are only one step away from forming your industry’s next innovative startup. 
  8. Create a culture of intrapreneurship. Any professional with experience at innovative companies can contribute to creating a culture where intrapreneurship thrives. That means communicating and supporting values like transparency, rewarding proactive behavior (something corporations aren’t always good at), fixing problems as they arise to avoid normalizing the bad, and encouraging healthy internal competition. 
  9. Don’t focus on solutions at first. In some cases, intrapreneurs shouldn’t be given a specific problem to solve. The process of finding a solution entails crossing off possibilities one by one—and one of those discarded ideas might actually be the winner. “It’s better to stay in what we call the ‘problem space’ for as long as possible.” So give intrapreneurs space to develop a deep understanding of the problem before setting them on the path to developing viable solutions. 
  10. Define success. Don’t chase trends; you’ll always lag behind. True innovation comes from creating products and services that are singular experiences that other businesses can’t replicate. Setting up dedicated teams focused on discovering where true innovation lies is a cornerstone of long-term customer engagement and long-term success.
  11. Turbocharge intrapreneurship with artificial intelligence. As Entefy has written previously, AI turbocharges intrapreneurship. Here’s how it works. As AI enters the mainstream, cognitive systems assume low-level tasks and free employees to focus on higher-value work. The professionals that will thrive in this environment are—you guessed it—the natural intrapreneurs, who transform their newfound freedom from drudgery into powerful new ideas. 

Intrapreneurship programs are win-wins for businesses and employees. They allow companies to get the best ideas from their brightest thinkers by giving them the autonomy to do their best work.

Humans make irrational and emotional decisions. But why?

What separates humans from all the other creatures sharing the Earth with us? One difference is of course our capacity for rational thought and logical decisionmaking. Then there’s our now 50,000-year-old legacy of supplementing our natural capabilities with increasingly sophisticated tools—everything from labor-saving devices to medicines to, most recently, smartphones.

But are we humans always rational? Do we always use our tools optimally? Well, no. Just think of the last time you made a snap decision when deeper analysis would have been better. Or procrastinated on work in favor of more time catching up on social media.

We live in such an interesting time, where our most ubiquitous computing devices—smartphones—are equal parts Swiss Army knives of powerful capabilities and, when we’re not careful, the ultimate time thieves. But that is changing as cognitive technologies like artificial intelligence move out of the labs and into our lives. Because one thing AI does very well is make recommendations by studying data (about you, about the weather, about diseases, about everything), delivering insights that would otherwise elude us.

It’s just one small step from AI-powered recommendations to AI-powered decisions made on your behalf. But there are two important questions to answer before we hand over decision-making duties to our favorite devices. Do we want technology making decisions for us in the first place? And, if so, what’s gained and what’s lost when smart machines become even more central to our day-to-day lives?

The complex relationship between emotions and logic

Before digging into the technology side of things, it’s worth noting why we so often get things wrong despite our best intentions. Good decisions require putting together the right information in the right way. The more information there is, the more complex the analysis and the easier it is to overlook or misread the details. Of course, we don’t always have the time or energy to gather all the information that might exist, and so we rely—more than we might realize—on gut reactions and rules of thumb. 

An interesting example of this comes from the work of neuroscientist Antonio Damasio. In the 1990s, his research into how emotions impact decisionmaking found that when the brain regions responsible for processing emotions are damaged, people can retain their ability to reason yet become unable to make seemingly simple decisions.

Damasio studied a patient known as Elliot, who at one point was a successful businessman and loving husband. Elliot suffered from a brain tumor that damaged parts of his orbitofrontal cortex, the brain region that connects the frontal lobes with our emotional machinery. 

Elliot retained a good IQ score, and those around him felt that he remained an intelligent and able man. Yet he began leaving work unfinished to the point that he was fired from his job. He divorced his wife only to marry someone his family disapproved of, divorced again, and then went bankrupt after going into business with a shady character. 

Damasio found that Elliot could think up many options and ideas regarding certain decisions, but something broke down when it came time to actually make one. “I began to think that the cold-bloodedness of Elliot’s reasoning prevented him from assigning different values to different options,” recalled Damasio, “and made his decision-making landscape hopelessly flat.”

Perhaps it’s here that technology could help. When we lack the time or energy for thorough consideration, our smart devices can come to our aid. With their ability to quickly process data, they should be able to help point us in a direction that’s more logical than our instincts. Except that the technology that could do this still faces some man-made headwinds. Namely, the modern-day realities of choice overload and information overload.

Choice overload

We face and make so many choices every day. Choice seems like a good thing, offering us greater freedom and autonomy, the ability to find something that perfectly suits our needs. But all these options have a downside. When we are asked to make decision after decision, our brains become tired and frustrated. Psychologists call this decision fatigue, and when it sets in it takes a concerted effort to muster the self-control needed to continue making smart choices throughout the day.

Psychologist Roy F. Baumeister’s research into the power of self-control has found that people who fight their urges, such as resisting cookies and sweets, are actually less able to resist subsequent temptations. When he had people watch an emotional movie while explicitly trying not to display their emotions, they became more emotional more quickly on subsequent tests of emotional self-control. 

This effect can be seen in a dramatic way in the justice system. Research into judicial decisions found that prisoners at parole hearings who were evaluated early in the day were granted parole 70% of the time, while just 10% of those judged later in the day received a favorable ruling. 

The judges responsible for granting parole get worn down as the day goes by, and when they lack the energy for a detailed analysis, they choose the easy option or the one that leaves them more options in the future. 

Interestingly, one solution to this problem is to eat. Research shows that the brain consumes nearly 20% of the body’s energy and is a prime candidate for a glucose boost when we get tired. Another solution is to use technology in ways that reduce, not increase, our options. Easy. Except for our second headwind to technology-powered decisions: information overload.

Information overload

Part of the reason we have so much choice is that we have so much information. When trying to decide something that requires some research, it’s easier than ever to consume article after article hoping to come to a full and complete understanding of the topic, and thus primed to make a perfectly rational and optimal decision. If only our brains would cooperate. Because the human brain doesn’t do well with information overload.

For some insight into how the brain can mishandle information, we turn to the world of professional sports. Economist Richard Thaler conducted a study of star players selected during the NFL draft and found that teams often place a disproportionately large value on the early picks. Thaler observed that the scouts responsible for examining players could become overconfident in their decisions the more information they gathered. Thaler wrote, “The more information teams acquire about players, the more overconfident they will feel about their ability to make fine distinctions.”

While accumulating information seems like the ideal way to get to the bottom of something, any objective analysis requires not only that we know where and what to look for, but also that we interpret that information correctly.

During the search stage, it is easy to favor information that supports or confirms a hunch we already have. Thanks to all the information available online, it is not difficult to find support for whatever you believe to be true, but doing so often leaves you with a biased perspective. When there is no expert consensus on an idea or hypothesis, we are likely to find many theories, opinions, and observations examining the problem from opposing or differing angles. Yet we often select the information that supports our initial hunch.

The mind prefers cohesion, and holding multiple ideas that don’t work with each other can make us feel uncomfortable—what psychologists call cognitive dissonance. After all, the brain likes nothing better than to jump to a conclusion simply to escape that discomfort. 

The best strategy for countering information overload and cognitive bias is to expose yourself to contradictory information and counter-arguments. But it’s also important to know when to call off the search. Answers aren’t always found in the time we have, and time is a scarce resource. It pays to establish boundaries to your search by limiting how long you work on the problem or raising your threshold for what information you consider relevant.

Using technology to decide

We are not built to make decisions all day long and, when we try, we tend to grow tired and impulsive. Technology can certainly help. Cognitive technologies like artificial intelligence are capable of crunching numbers and solving problems too complex or time-consuming for us. And as artificial intelligence and computing power continue to improve, this ability will only increase. 

Yet before we can fully come to rely on our devices to augment our decisionmaking capacities, we need to establish trust. We need to be sure that the decisions being made are in our best interests, and not the interests of the owner of the platform providing the technology. That’s a big challenge of choice overload: machines need to support us when we need it, not just assume our responsibilities. 

Then there’s the delicate role of information overload: designing systems that interpret data the way we would.

With time and the ongoing advances in AI, we’re not too far off from confidently making the right choices about the tools we’ll use to make choices.

Design wisdom for the age of artificial intelligence

Barry Katz is a Visionary Circle Advisor at Entefy. A Fellow at IDEO, he is also a Consulting Professor, Design Group, at Stanford University and Professor of Industrial and Interaction Design at California College of the Arts.

I spent much of last spring immersed in the legacy of Charles and Ray Eames, the legendary design partnership that defined the culture of midcentury America and lives on today in the 21st century’s reverence for Midcentury Modernism. After Charles’ death in 1978, Ray closed the office in Venice, California, and spent much of the next decade archiving the record of their historic 45-year partnership (Ray died ten years to the day after Charles). 

One portion of their legacy was deposited to the Library of Congress where it is available to researchers; the remainder resides at the family ranch in northern California where, thanks to the family’s generosity, my students had the opportunity to rummage through models, fabric samples, color swatches, film stills, books, tools, correspondence, photographs, props, and memorabilia. It’s one thing to view the Eames Lounge Chair and Ottoman from behind a velvet rope in the “design” section of a modern art museum, or to watch their iconic Powers of Ten on YouTube. It’s quite another to see the scores of experiments, trials, prototypes, and dead ends that preceded them.

That experience—juxtaposed with my opportunity to observe Entefy’s growth and evolution in designing the first universal communicator—has led me to reflect on the relevance of some of Charles Eames’ key precepts, principles, and parables to design in the digital age. Here are a few:

“The details are not just details. They make the product.”

It’s easy to spot the unresolved details of a physical product: the latch doesn’t close securely, the stitching is uneven, the on/off switch is awkwardly placed. But the same can be said about a software interface, a mobile application, or even the invisible, underlying code that most people will never see. Charles once boasted that he willingly accepted constraints, but never accepted compromises. And so I say to the designers of the digital: No compromises!

“Start from a pure place.”

The giants of Silicon Valley—Hewlett and Packard, Gordon Moore and Robert Noyce, and in successive generations, Steve Jobs—became fabulously wealthy, but no one will ever convince me that HP, Intel, or Apple were built on dreams of wealth. 

I once had the opportunity to ask Jobs what motivated him and he replied, “Things come along now and then that change the way we live, and we happen to be in the right place at the right time to influence the evolution of these things. You know, when you change a vector in its first inch by just a little bit, when it gets three miles out there it’s moved quite a bit.” Nothing there about the size of his bank account.

“After the age of information comes the age of choices.”

For most of human history, we have suffered from a lack of information—about the best season to plant our crops, when to retreat to higher ground, whether to save, to spend, or to invest. Abruptly—sometime around 1945, according to some—civilization was upended and we began to suffer from too much information. Submerged under a torrent of data, bits, pixels, texts, and tweets, we might be poised now to evolve to the next plateau: not more information, or even (as Eames imagined) more choices, but simply the power to get what we want.  

“The role of the designer is that of a very good, thoughtful host, anticipating the needs of his guest.”

This is something that every software engineer, coder, data scientist, hacker, and information architect should take to heart: You are not creating for yourself. You are creating for another person, not an “end-user” (ugh!) but a guest, whom you must welcome into your world. The people who use what you build must be made to feel at home with what you have done.

“Eventually everything connects—people, ideas, objects… The quality of the connections is the key to quality per se.”

This is surely the promise of Entefy’s universal communicator: augmented intelligence that is smart enough to weave together the threads of our ever-more-complicated lives, seamlessly, invisibly. And so Midcentury Modernism enters the 21st century. 

Disruption

AI disruption already underway in these 8 industries [SLIDES]

Artificial intelligence is transforming business-as-usual in a range of industries, all around the globe. Many of these developments are pretty familiar by now: self-driving vehicles, smart homes, and Internet of Things-powered smart appliances. 

Focusing on advancements that don’t always make it into the headlines, these slides highlight the disruptive impact of AI and automation on 8 diverse industries. You can read more about the research featured in this presentation in Entefy’s article, AI disruption already underway in these 8 industries.

Blockchain

Is blockchain the hero cybersecurity needs?

The year was 2009. A mysterious figure (or figures – the jury is still out) named Satoshi Nakamoto had just unleashed Bitcoin, a cryptocurrency that drew equal amounts of excitement and skepticism. Though Nakamoto wasn’t the first person to explore the development of a cryptocurrency, he was the first to bring one to fruition. Bitcoin popularized the concept of digital currencies and sparked impassioned debates about whether a decentralized, digital system could destabilize (and ultimately dethrone) fiat currencies such as the dollar. Bitcoin also introduced the world to the concept of the blockchain.

Blockchain is the technology that underpins Bitcoin and other cryptocurrencies, such as Ethereum and Litecoin. In the eight years since Bitcoin’s debut, blockchain has emerged not merely as a means of creating new types of currencies but as a solution to many urgent concerns—cybersecurity chief among them. 

What is blockchain?

The term blockchain refers to a distributed, decentralized ledger: a shared database of transactions. Each “block” of information records a batch of new transactions along with a cryptographic hash of the block before it, creating a comprehensive, tamper-evident record. This hash linkage joins the blocks into chains of data blocks, which gives us the term blockchain. Complete copies of the ledger are then maintained and kept in sync across a global network of computer nodes.

The distributed nature of the technology ensures that no one person or institution controls the data recorded on the blockchain. Instead, the participating computers form a network of nodes, and before a new data block joins the chain, the nodes verify its authenticity using automated, cryptographic algorithms and then append it to the ledger.
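The hash linkage described above can be sketched in a few lines of Python. This is a minimal, single-machine illustration (the block fields and function names here are invented for the example); real blockchains add consensus rules, proof-of-work or proof-of-stake, and peer-to-peer replication across many nodes:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, excluding its own hash field.
    payload = {k: block[k] for k in ("index", "data", "prev_hash")}
    serialized = json.dumps(payload, sort_keys=True).encode()
    return hashlib.sha256(serialized).hexdigest()

def make_block(index, data, prev_hash):
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def chain_is_valid(chain):
    # Every block must hash correctly and reference its predecessor's hash.
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny three-block chain.
chain = [make_block(0, "genesis", "0" * 64)]
chain.append(make_block(1, "Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block(2, "Bob pays Carol 2", chain[-1]["hash"]))
assert chain_is_valid(chain)

# Tampering with any recorded transaction breaks the chain's validity.
chain[1]["data"] = "Alice pays Bob 500"
assert not chain_is_valid(chain)
```

Because each block’s hash covers the previous block’s hash, rewriting history means recomputing every subsequent block, and a network of independent nodes would reject the altered copy.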

Understanding how blockchain works explains how cryptocurrencies function, and also why they represent an entirely new form of monetary exchange. When someone makes a purchase with a cryptocurrency like Bitcoin, the Bitcoin blockchain records the transaction. Bitcoin transactions are transparent, in that certain attributes of the participants in a transaction are publicly available; however, users can take steps to shield their identities and protect their privacy.

Back to blockchain. A blockchain ledger can record a wide variety of transaction types, some of which we’ll discuss in a moment. Because information recorded on the blockchain cannot be altered by any single party, many people see this technology as a means of enhancing both transparency and security. For instance, the British government is mulling the possibility of using blockchain technology to distribute taxpayer-funded grants, allowing for greater clarity on how the money is used. 

Blockchain’s disruptive potential 

Many blockchain supporters highlight the decentralized nature of the system. That same decentralization is why many institutions, banks in particular, have a love/hate relationship with the technology: “Big financial institutions are embracing the technology, in part, because they fear it. Since it is relatively easy to use and entails low costs, new players — even small businesses without much capital muscle — could conceivably use it to offer financial services, too.” Financial transactions tend to flow through intermediaries like banks and credit card processors, which collect fees for the use of their payment infrastructure. Blockchain does an end run around that infrastructure.

Here’s an example. In developing countries, blockchain companies have begun using the technology to provide low-cost remittance services for domestic workers who move abroad in search of well-paying work. These workers are often the primary breadwinners for their families, but they lose a significant amount of money due to high remittance fees charged by traditional banks. Blockchain-powered remittance services promise reduced fees and faster transactions, enabling them to send more of their hard-earned money home.  

But there are many potential uses of the underlying technology. At first glance, the Swedish government, anti-human trafficking advocates, and food security experts might not seem to have much in common. All three, however, stand to benefit from blockchain, which can be applied to land registries, identity verification, and food supply chain tracking. The implications of blockchain extend far beyond cryptocurrencies.

Adding artificial intelligence to the blockchain 

Coupled with artificial intelligence, blockchain’s impact could increase exponentially. Already, developers and entrepreneurs are exploring how the open-source, decentralized nature of the blockchain could propel AI research in the coming years. But the two disciplines could become a real dynamic duo when jointly applied to cybersecurity. 

As Entefy explored earlier this year, artificial intelligence is both a threat and a powerful defense in cybersecurity. The sheer volume of data generated globally every millisecond far exceeds the capacity of any single cybersecurity expert to monitor for cyberattacks. We increasingly rely on smart algorithms to flag potential hacks or instances of fraud, and white hat hackers could eventually use AI to anticipate cyberattacks and build better defenses.

On the other side of the equation, malicious hackers are harnessing AI to deploy ever-more-sophisticated cyberattacks, which is why cybersecurity is rapidly becoming an urgent concern for companies and individuals alike. Blockchain may be able to augment our efforts to thwart dangerous cyberattacks, offering a much-needed ray of hope in the face of hacks such as the Equifax data breach and WannaCry ransomware attack. 

The hero cybersecurity experts need 

Interest in blockchain is indisputably on the rise. Forbes noted that “blockchain-related technology” was referenced 1,500 times in corporate SEC filings from January through September of this year, indicating real enthusiasm among businesses and investors. It’s blockchain’s time to shine, especially when it comes to cybersecurity: “Blockchain technologies are, after all, the culmination of decades of research and breakthroughs in cryptography and security.”

The encryption built into the blockchain makes it a natural tool in the quest for enhanced cybersecurity protocols. It also offers another benefit to experts fending off cyber attackers. Rather than use passwords for identity verification, blockchain security platforms can use public and private keys that are associated with encrypted data files that live on the blockchain. 

Unlike centralized password databases, which can be relatively easy targets for hackers, the individual keys are virtually impossible to crack. Some companies that have begun using blockchain technology in their security protocols also use two-factor authentication to add another layer of defense. 
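To see why key-based verification is harder to attack than a stored password, consider how a public/private key pair is used to sign and check a challenge. Below is a toy RSA signature in Python with textbook-sized parameters; it is purely illustrative (real systems use vetted cryptography libraries, vastly larger keys, and sign a hash of the message rather than the raw message):

```python
# Toy RSA sign/verify -- textbook parameters for readability only.
p, q = 61, 53                 # secret primes (real keys: hundreds of digits)
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, shared openly
d = pow(e, -1, phi)           # private exponent (2753), kept secret; Python 3.8+

def sign(message: int) -> int:
    # Only the private-key holder can produce this value. The message
    # must be an integer smaller than n in this toy scheme.
    return pow(message, d, n)

def verify(message: int, signature: int) -> bool:
    # Anyone holding only the public key (n, e) can check the signature;
    # the server never needs to store a password.
    return pow(signature, e, n) == message

sig = sign(123)
assert verify(123, sig)                # genuine signature checks out
assert not verify(123, (sig + 1) % n)  # any altered signature fails
```

The server stores only public keys, so a database breach yields nothing an attacker can replay, unlike a trove of password hashes.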

Messaging services also benefit from blockchain-powered security, as these programs can distribute metadata in a way that prevents hackers from using people’s digital footprints to steal their information. This dispersal of data matters because it hides the information from attackers even more thoroughly than end-to-end encryption alone.

Blockchain could help secure cloud network usage as well, by allowing companies to share only partial data with different parties instead of providing full access to their files. Some platforms already use blockchain algorithms to fight cyberattacks by time-stamping activities in a way that pinpoints when attackers try to manipulate their data.  

Blockchain is even being highlighted as a more secure alternative to Social Security numbers, something on everyone’s mind after the Equifax breach. A cryptographic identification system could better safeguard citizens’ data than the traditional Social Security system. Estonia’s government has already moved toward using blockchain and biometrics to record and secure individuals’ information. Along with Dubai, Estonia is also exploring the use of blockchain to protect individual medical records. 

Of course, no system is perfect, especially not one that’s less than a decade old. In a Deloitte report on blockchain and cybersecurity, the firm noted that despite the added protection private keys provide, they can still be stolen or intercepted. For instance, the more devices a person uses with their key, the more likely they are to lose control of their data access. Nonetheless, Deloitte also noted that the cryptographic algorithms used to generate private/public key pairs “are hard to break with current computing power.”

Blockchain systems offer a much-needed upgrade to the way we defend ourselves against malicious hackers, especially when combined with machine learning systems designed to identify suspicious behavior and anticipate attacks. The technology gives us new ways of thinking about cybersecurity and defending against the rapidly advancing hacks of the future.

Brienne

Brienne on the panel of judges at the Girls in Tech AMPLIFY pitch competition

Entefy’s own Co-Founder Brienne Ghafourifar was in San Francisco to serve as a judge at the Girls in Tech AMPLIFY pitch competition. At AMPLIFY, female founders pitch their startups on stage before a panel of judges consisting of Silicon Valley CEOs, CTOs, investors, and entrepreneurs.

Selected from 294 applications, the 10 finalists represented businesses focused on everything from advertising to healthcare, and security to CRM. The founders were competing for mentorship opportunities, office space, and that lifeblood of entrepreneurship—capital.

Brienne and the other judges evaluated the contestants’ 10-minute pitches and responses during rapid-fire Q&A sessions. The winner, Scollar, was announced at the end of the event when its founders were presented with a giant novelty check for $10,000.

Entefy celebrates Girls in Tech’s mission to support women entrepreneurs around the globe. Congratulations to the 10 AMPLIFY finalists, and a special round of applause to the winners from Scollar.