
IF hyper.talented THEN apply.to.Entefy

“Choose a job you love, and you will never have to work a day in your life.” —Confucius 

This is an ethos that deeply resonates with us, the powerful idea that the work we do can be simultaneously challenging and fulfilling. After all, when what you’re doing is solving big, hairy, global challenges—with the potential to quite literally make people’s lives better—the separation of “work” and “life” seems tired and outdated. 

From the beginning, Entefy has recruited a global team of amazingly talented people who don’t shy away from a major challenge. We’re a growing tribe of impact-minded professionals who are inspired by Entefy’s mission to create technology that saves people time so they can live and work better. And by people we mean every person everywhere.

Now is a great time to explore opportunities with us. 2017 has already been a big year for the company. We concluded our Series A financing round and were issued a significant patent by the USPTO. Then another one. And the momentum in product and core innovation continues to build.

Entefy is a company that’s all about people. Making people’s digital lives easier and more convenient through first-of-its-kind AI-powered technology. And, internally, creating an environment where our team members work alongside people who feel like family. That’s not just rhetoric. You can hear what it’s like to work here firsthand from the Entefyers themselves in our video playlist, “Entefyers on Entefy.”

We’ve come far, but there’s a lot of work—challenging, rewarding work—still to be done. That’s why we’re again scouring the globe for new team members to fill multiple open roles. 

Think you’re ready for the challenge of a lifetime? Entefy is hiring.


3,500-year-old data indexing [VIDEO]

Data indexing is central to modern information management. That’s the job of recording dates, storage locations, types of information, and so on. Without indexing, computers can’t locate and recall information quickly. 
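The idea is easy to see in miniature. Below is a hedged sketch (the records and field names are purely hypothetical, for illustration only) of what an index does: without one, finding a record means scanning everything; with one, lookup is a single step.

```python
# Hypothetical archive records (illustrative only).
records = [
    {"id": 101, "date": "1500 BCE", "type": "tax tablet", "location": "shelf A"},
    {"id": 102, "date": "1480 BCE", "type": "treaty", "location": "shelf B"},
    {"id": 103, "date": "1450 BCE", "type": "tax tablet", "location": "shelf A"},
]

# Build an index: map each record type to the ids filed under it.
index = {}
for rec in records:
    index.setdefault(rec["type"], []).append(rec["id"])

# Lookup via the index avoids scanning every record.
print(index["tax tablet"])  # [101, 103]
```

The same principle, a lookup structure kept separate from the records themselves, is what lets both a clay-tablet archive and a modern database answer "where is it?" without reading everything.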

The history of information management dates back to a time when clay tablets were hi-tech. The ancient Hittites were great innovators when it came to managing the information about their extensive empire. In this video enFact, we look at a 3,500-year-old method for indexing data.

Read more about ancient data indexes here

Entefy’s enFacts are illuminating nuggets of information about the intersection of communications, artificial intelligence, security and cyber privacy, and the Internet of Things. Have an idea for an enFact? We would love to hear from you. 


Here’s how AI turbocharges intrapreneurship

The past two decades have belonged to entrepreneurs. As innovative technologies reshaped our lives, ambitious founders became the icons of the age. Now it’s the intrapreneurs’ time to shine as well. 

If you’re unfamiliar with the term, intrapreneurship is similar to entrepreneurship except that it occurs inside an existing company. Wired described intrapreneurship as a new source of happiness at work, saying: “Whereas entrepreneurship is the act of spearheading a new business or venture, intrapreneurship is the act of spearheading new programs, products, services, innovations, and policies within your organization.” Intrapreneurship programs are win-wins for businesses and employees. They allow companies to leverage their best and brightest thinkers, and they offer talented workers autonomy over their careers. 

As artificial intelligence transforms workplaces and employees’ roles within their organizations, it could also give rise to a more entrepreneurially minded workforce. That’s not to say that we’re all going to quit our jobs to launch AI-powered startups. But with AI taking over low-level tasks and giving employees the luxury of more time in their days, we will likely see a move toward increased intrapreneurship.

Leave the grunt work to AI 

Once upon a time, email and digital messages were a novel experience. Now they can be a nuisance. The average U.S. adult sends and receives 224 messages a day. The more often people message, the less productive, and less happy, they are by the end of the workday. No surprise there. How can you feel fulfilled by your work when you spend more than 20 hours a week in your inbox?

Imagine if instead of getting caught in low-value email threads, employees were able to reclaim that time and invest it in new ideas. Rather than plodding through their inboxes, they could be researching customer pain points, analyzing anomalies in market trends, and brainstorming improvements to the sales funnel. More importantly, imagine that they could then use their findings to develop innovative solutions that break new ground for their companies. 

AI is making that possible. New tech platforms are capable of parsing and responding to emails and scheduling requests, unshackling employees from their inboxes for the first time in decades. AI is becoming a key tool in data entry and report generation, further relieving workers of the least cognitively rewarding tasks. One World Bank productivity expert stated, “All over the world, including in the United States and Europe, jobs are shifting from routine tasks, which are prone to automation, towards interactive tasks, which require advanced cognitive and behavioral skills.”

The age of intrapreneurship 

AI systems will be the defining technologies of the coming decades, and they’re bringing with them a raft of new workplace trends. Intrapreneurship may well be one of them. Many people turn to entrepreneurship when they feel dissatisfied or unfulfilled in their current work. But the demands of launching a business can exacerbate their stresses, especially given the uncertainty inherent in the entrepreneurship process. Other people may feel entrepreneurial yearnings but resist pursuing them due to financial or familial obligations. Millennials are a fine example of this. Despite expectations that they would be “the most entrepreneurial generation,” this demographic is burdened by heavy student loan debt that makes it difficult to sacrifice a sure paycheck for the dream of self-employment.

But that’s where employers have an opportunity. By giving workers freedom to explore and iterate on their ideas, they can produce groundbreaking new products and services while keeping their best people engaged. AI is already leveling the playing field between small businesses and Fortune 500s in terms of resource-efficiency and data-gathering. The next frontier in business competition is sustained innovation, and intrapreneurs will become invaluable sources of great ideas as AI frees them to do better, deeper work for their companies.

Instituting intrapreneurship 

People are happiest when they feel their lives have meaning, and there’s no better way to foster a sense of purpose than to let workers take ownership of their ideas – and the company’s future. 

The purpose of such programs is not to give employees free rein to use company resources on whatever catches their fancy on a given day. Leaders can ensure the success of an intrapreneurship policy by creating a clear structure and criteria for evaluating and approving employee projects. The process for turning employees into intrapreneurs starts with letting employees know which managers to approach with their ideas and how to self-evaluate their pitches to increase their chances of approval. The better everyone understands the business’ current needs and priorities, the more relevant employee initiatives can be.

This is another aspect of managers’ roles as talent spotters. Managers should watch for employees who show particularly entrepreneurial tendencies. Tell-tale signs include self-motivation, directness in communication, adaptability, and a high degree of commitment to their work. By nurturing talented innovators and encouraging them to share their ideas, managers can set the tone for the types of programs they want to cultivate. 

Becoming an intrapreneur 

Intrapreneurship is a great way for employees to gain control of their day-to-day work experiences. Instead of reacting to assignments handed down from their bosses, they can pursue projects that inspire and motivate them. They can maximize the chances of gaining support for their ideas by keeping two principles in mind: 

1. Go deep, not broad. The first thing to know about intrapreneurship is that employees should focus on specific challenges their company already faces. So be hyper-focused when pitching your manager. Explain the issue you’ve identified, how you plan to solve it, the impact your project will have, and the types of resources you need to get it done. Although intrapreneurship endeavors can be great opportunities for cultivating new skills, make sure you have enough baseline know-how to see the project through. Alternatively, you can showcase your leadership instincts by assembling an informal team of colleagues who possess the complementary skills the solution requires.

2. Earn buy-in before you present your idea. If you already have a track record of self-directed success, your bosses may give you some leeway. But if you’re new to intrapreneurship and want to make a good impression, find a champion for your idea. Figure out who stands to benefit most from your solution, and get their feedback before running it through the official channels. Having internal support before you’ve even pitched the concept will give you credibility with key decision-makers.

We can all be intrapreneurs now

The most successful intrapreneurs – and the companies they work for – will use artificial intelligence to not only automate repetitive tasks but to drive progress as well. As employees become more familiar with AI platforms, they should look for ways to further leverage those capabilities within the organization, always thinking about new and better innovation. 

Founding a business isn’t an option for everyone. But professionals throughout the workforce can nurture their entrepreneurial instincts and develop fulfilling, self-led careers through the opportunities AI makes possible. And some of the most interesting applications of AI may well come from a new class of intrapreneurs.  


Here’s how to become a super-manager in the age of AI

We’re endlessly curious about how artificial intelligence will transform our working lives and reshape the way we think about our careers. That’s why our team at Entefy has examined what the AI-powered office of the future will look like and the types of learning skills that will be at a premium in the coming decades. 

Now we’re taking a deeper look at what AI means for managers. The superstar managers of the AI era will be much more than scheduling ninjas or efficient task managers. They’ll serve as bridges between AI systems and their team members, guiding them toward more innovative and collaborative pursuits. Finally rid of repetitive and cognitively uninteresting tasks, managers will be free to unleash the full breadth of their creative skills and leadership acumen.  

The role of AI in management

Before looking at the evolving role of managers, let’s consider the changes AI is already driving in companies throughout the country. At present, managers spend 54% of their time on administrative tasks like scheduling and logistics coordination. As artificial intelligence systems become increasingly capable of fielding appointment requests, responding to emails, and generating quarterly and annual reports, managers will be able to redirect their attention to richer, more challenging priorities. 

Because today’s managers are still directly involved in administrative duties, they spend 10% of their time on strategy and innovation and only 7% on developing their in-house talent and engaging stakeholders. AI can improve that ratio, enabling managers to double the time they spend collaborating on new initiatives and investing in personnel and community development.   

Managers should anticipate big changes ahead as AI becomes integrated into their workflows. Not only will they serve as leaders and facilitators, as they do today, they’ll find their analytical and decision-making skills called into sharper focus. Judgement work, which requires a keen understanding of data and of its human impact, will become paramount. New skills will be needed and existing skills sharpened, like digital aptitude, creative thinking, data analysis, and strategic development.  

Our research suggests that super managers in the age of AI will inhabit 3 distinct roles simultaneously: the empathetic mentor, the data-driven decision-maker, and the creative innovator. 

Managers as empathetic mentors 

Freed from the mundanity of scheduling and logistics, managers will be able to devote time to helping their employees improve their skills. We know that AI will change the way many different departments work, so team members will need to adapt through tech trainings and rethinking their contributions to the company. Leaders will need to hone their “outcentric” management skills to advance their team’s development, meaning that they’ll need to nurture employees’ abilities to ensure everyone actively contributes. In a sense, managers will become skills assessment experts, identifying workers’ strengths and molding them into more well-rounded team members.  

Managers will also become both students and teachers of AI systems. In a recent survey of 4,000 workers across the U.S., U.K., and Germany, the majority felt underprepared to fully leverage AI’s benefits. They expressed optimism that technology will make their workplaces more collaborative and will strengthen relationships among team members. But they’ll need their managers’ guidance on how to maximize the tools that are rapidly becoming available to them.  

While not all managers will hold explicitly technical roles, they’ll still need to learn how to approach AI technologies like machine learning as a non-technical leader. Then they’ll have to train their colleagues on how to use those tools. At the very least, they’ll need to connect the dots between what AI platforms can do and how those functions correspond to the team’s goals. As workers transition to using AI assistants for data entry and scheduling, managers may need to engage in some handholding as employees adapt to their new workflows. 

While we know that AI-powered companies can become rich environments for learning and innovation, change isn’t always easy. Managers will be responsible for easing the path for their employees by helping them develop new professional goals in light of AI and strategizing how they can use AI to do their jobs more effectively.

Managers as data-driven decision-makers

Equipped with deep insights and forecasts generated by AI, managers will be better prepared to make smart decisions on behalf of their companies. Across departments, they’ll have more and better information to help their teams excel. Tools such as predictive analytics will aid them in deciding which initiatives are most likely to resonate among customers to minimize costs and maximize ROI. Machine learning will analyze expected outcomes based on company data, lowering the risk of poor investments. 

However, human judgement will still be critically important in executive decision-making. An AI program might recommend particular cost-savings measures or produce a recommendation in favor of a new business initiative. However, a human manager will need to decide whether the process of achieving those aims aligns with the company’s mission and values. As a 2016 Deloitte study titled “Talent for survival: Essential skills for humans working in the machine age” asserts, it’s not all about technical analysis. Numbers are important, but they don’t always tell the full story. Managers can bring empathy and context to the picture to decide the best course for the company’s long-term goals, which aligns with Deloitte’s prediction that problem-solving, social skills, creativity, and emotional intelligence will serve as vital complements to increased technical skills. 

Managers may also evolve into explainers, a new category of jobs created by AI. Explainers will monitor the effects AI algorithms have on the business’s goals and communicate those to the company’s leadership. They’ll also determine which strategies will benefit from AI assistance and which require more traditional approaches. 

Managers as creative innovators

The word creativity often brings to mind painters or writers. But in the AI era, managers across organizations—and not only those in traditionally creative departments—will be called upon to create unique solutions and products. Creativity is no longer solely for the creatives.

In this sense, creativity in leadership roles represents the ability to distill complex ideas, identify novel patterns, and devise innovative solutions. Managers must look at business challenges in different ways and from new perspectives. 

Managers may soon be required to synthesize new ideas, analyze the potential impacts on their companies, and then lead production and promotion campaigns to ensure that those are realized. We’ll see such dynamics play out in all different departments as managers turn their attention away from administration and toward increased innovation. For instance, with AI’s assistance, managers can overhaul outdated workflows, implement more dynamic project management strategies, and use customer data to brainstorm truly disruptive ideas. Predictive tools will allow them to better understand the market to determine which concepts are viable and which new endeavors are likely to bear fruit. The World Economic Forum puts it plainly: “Creativity and critical thinking skills are increasingly important in an automated workforce.”

AI will help managers do their jobs better even as it makes the work itself more fulfilling. The coming changes herald an era in which managers have the bandwidth to focus on complex, high-profile tasks like employee development and innovative brainstorming. The automation of low-level tasks and the powerful insights generated by predictive analytics and other AI-powered systems will give them more time and more data than they’ve ever had before. Which sounds like a recipe for high-quality performance. 


The hazards of confirmation bias in life and work [VIDEO]

People spend 36% more time reading articles that support their own established opinions. Why do we put so much effort into confirming what we already believe? The culprit is confirmation bias, the brain’s tendency to interpret the world in ways that are consistent with existing expectations. Confirmation bias is a mental shortcut that lightens the brain’s cognitive load, but the convenience comes with its own challenges. 

This video presents an overview of how confirmation bias works, and how you can teach yourself to burst your own bubbles of bias. Read more about the hazards of confirmation bias in life and work here.


A timeless lesson in management

Dr. Rudd Canaday is Entefy’s Software Architecture Fellow. He is a co-inventor of the UNIX operating system and was a graduate student at MIT’s pioneering Computer Science & Artificial Intelligence Lab. Rudd shares some of his experiences during computing’s early days below—as well as a timeless career insight.

In 1975, when I was 37, I got my first and only job at Bell Labs that was not in research. Bell Labs had written, a long time before, an elaborate software system named DIR/ECT, for DIRECTORY, used in printing white pages phone books. This sounds like a simple job, but the rules for how listings were arranged, alphabetized, and displayed in the white pages were arcane, dating back to the turn of the 20th century. This software was obsolete, using batch processing with data on magnetic tapes. It was notoriously hard to use and error prone. I was given the unenviable task of figuring out what could be done.

The head of the department maintaining the DIR/ECT system and I looked at the system and decided that it was not practical to bring it into the (then) modern on-line age. We decided that a new system needed to be built. Officially named “DIR/ECT II,” we called it “the upgrade” to emphasize that it would be fully compatible with the old system. The work of maintaining the old system, and of building a new one, was funded by the “operating companies,” the telephone companies that were part of AT&T at the time. So, the DIR/ECT department head and I had to convince the operating companies that a new system was needed, and that they should fund it. Which wasn’t that difficult, since the old system was so challenging to use. My estimate to get the job done? Three years.

I formed a new department focused on the upgrade that eventually grew to 50 people under 7 supervisors. Meanwhile next door, the original DIR/ECT department of about 30 people maintained the old system.

DIR/ECT II was by far the largest department I had managed—and was to be my least successful managerial experience. Since it was being paid for by the AT&T operating companies, I had the job of presenting the project status in a twice-yearly meeting and convincing them to continue funding the project. At first these meetings were easy, but as we missed our deadlines they became more difficult and much more expensive in computer time. In the end, it took us 5 years, not 3, to complete DIR/ECT II.

Part of the problem was my decision early on that we would build the system on UNIX using a DEC (Digital Equipment Corp.) minicomputer. DIR/ECT II was the first AT&T product built on UNIX, and perhaps the first on a minicomputer. The decision to use UNIX was controversial largely because UNIX only ran on minicomputers. It turned out to be an unwise decision. The computational demands of the system were much higher than I had originally anticipated. 

As a result, a couple of years into the project we saw that we needed a faster machine. Fortunately, DEC was about to introduce a new machine that promised faster speed. After all, previous releases of new DEC machines had each doubled the speed of their predecessors. But this time the new machine didn’t perform as well. Without the extra speed that we had been counting on, we were forced to spend quite a bit of time trying to make our system more efficient. But when we finally went live with DIR/ECT II, performance was only marginal.

Meanwhile, the DIR/ECT department head was wrestling with the problem of motivating his people to work on a dinosaur while next door their colleagues were building the sexy new system. His answer was to challenge his team to improve the old system so drastically that by the time DIR/ECT II became available, the operating companies wouldn’t see a need for the new system. No one thought this was possible. The old system was just too cumbersome.

In the end, though, we agreed that the challenge would be good for his team. My colleague invented the slogan “Obviate the Upgrade,” and threw down the gauntlet. To our surprise, by the time my team finished DIR/ECT II, the old system had been transformed into a modern, on-line system. And, indeed, after the new system had successfully passed its trial period, none of the operating companies wanted it and it was abandoned. 

Moral of the story? Competition isn’t always a bad thing as it can lead teams to accomplish remarkable things.


Hiding in plain sight: 8 digital security threats in everyday life

What’s more valuable to you: protecting digital privacy…or free pizza?

It turns out a startling number of people in a Stanford study chose pizza over protection. In the study, 3,108 undergraduate students were told they were joining a study on the use of Bitcoin for making payments. Students answered survey questions about their views on digital privacy to capture data on their stated preferences. 

A group of students was then offered a free pizza if they divulged the email addresses of three of their friends. The overwhelming majority of students took the cheesy pizza for the low cost of compromising someone else’s digital privacy.

Interestingly, students who had described digital privacy as important to them were just as likely to choose pizza as the students who had no strong views about privacy. One researcher commented that people “are willing to relinquish private data quite easily when incentivized to do so.” Apparently, the cheesier the incentive the better.

“Generally, people don’t seem to be willing to take expensive actions or even very small actions to preserve their privacy,” the study’s author stated. “Even though, if you ask them, they express frustration, unhappiness or dislike of losing their privacy, they tend not to make choices that correspond to those preferences.” Add this to the list of complex paradoxes in the digital world: the deep disconnect between what we say about digital privacy and the actual choices we make.

These days, threats to our digital privacy and security can come from practically anywhere. And some of them are hiding in plain sight. Take a look: 

1. Think you’re safe from privacy violations at work? You’ll probably want to know that one report estimates 15% of the Fortune 500 make use of secret tracking devices hidden in lights and ID badges. One surveillance vendor reports that 350 different companies are using its products to monitor “conference room usage, employee whereabouts, and ‘latency’—how long someone goes without speaking to another co-worker.”

2. The CEO of iRobot, the maker of the popular Roomba automated vacuum cleaner, caused a stir after apparently suggesting the company was seeking deals to sell data about the layout of users’ homes to third parties. The company later clarified that it didn’t have any plans to sell the data without users’ consent. The situation shines a spotlight on the ongoing tension between personal privacy and the monetary value of certain types of consumer data. 

3. Achieving the elite heights of pro sports apparently doesn’t make you immune to privacy threats. The NBA and its players’ union are in conflict over how much data can be collected and shared using wearables like fitness trackers. The players’ union is seeking control over what data is collected and how it gets used. These are exactly the same legal and ethical issues being raised as more and more employers deploy wearables to their employees.

4. Your car is watching. Computer systems in many newer cars create records of pretty much everything you do on the road, from logging telephone calls to recording how fast you drive. The challenge for consumers is figuring out what’s being collected, and where it goes afterward. The legal situation in the U.S. is murky, with no one law covering data collection by automobiles.

5. Be careful what you say in front of Barbie. A study from University of Washington researchers demonstrates how the Internet of Toys is raising new privacy questions. In interviews with parents and children about the use of Internet-connected toys, the researchers found that children were unaware that their toys were recording their voices, and that parents worried about privacy pretty much any time the toys were out of the toy boxes. 

6. A lighthearted Facebook meme may unintentionally telegraph answers to your banking security questions. The post, called “10 Concerts I’ve Been To, One is a Lie,” asks users to share information about concerts they’ve attended. The problem is that “Name the first concert you attended” is a common security question used by banks and other financial institutions for online authentication. Phishing aside, the meme can also “telegraph information about a user’s age, musical tastes and even religious affiliation — all of which would be desirable to marketers hoping to target ads.” 

7. Usage-based insurance (UBI) is the term for insurance products that are priced according to specific usage factors. UBI auto insurance, for example, is priced on factors like how often a driver uses their car, how fast they take corners, and their average speed. University researchers were able to demonstrate that it’s possible to reveal personal data by pointing an AI algorithm at usage-based insurance data stored in the cloud. One researcher commented, “An attacker only needs one part of the information provided to a UBI company to discover a driver’s whereabouts, home, work, or who they met with.”

8. An audit by the Internet security nonprofit Online Trust Alliance found that 6 of the 13 “Free File Alliance” tax websites approved by the IRS provide inadequate security and privacy protection. The report states, “Criminals are increasingly penetrating IRS systems, targeting e-file service providers and harming consumers through bank account take-overs, identity theft, ransomware and compromising completed returns to redirect tax refunds.” As if April 15 wasn’t stressful enough.

All of these cases point to an important reality of the digital age: New privacy and security questions are created every time a new device is connected to the Internet. Which is why people will no longer think you’re crazy if you ask, “Is that Roomba watching me?”


Communication speed test [VIDEO]

We tend to read far faster than we speak, but when it comes to communication, speed isn’t everything. When you need to get your message across, context is where it’s at.

In this video enFact, we use some famous lines from Shakespeare to explore how meaning changes with the way words are delivered. An important point in today’s digital world where how you send a message can be as important as what you say.

Read the original version of this enFact here.



Learn new skills faster using your memory’s biggest weakness

In this era of lifelong learning—in which rapid technological advances are forcing professionals to acquire new skills throughout their careers—getting better at the process of learning is critically important. One element that’s often left out of discussions on learning is…forgetting. Because it turns out that forgetting is as important to learning as is remembering. 

Wouldn’t a perfect memory make life so much easier? No more forgotten names or misplaced keys. Everything you ever read or watched—podcasts, documentaries, books—available forever in perfect fidelity. Learning would be a cinch, just imagine how much smarter you’d become. 

Except: The brain doesn’t work that way. Remembering is an essential element in learning, but it isn’t everything. Because as counterintuitive as it sounds, forgetting is just as valuable as remembering.

Take the case of Solomon Shereshevsky. Shereshevsky had what most people would call a photographic memory. He could recall poems in foreign languages and long mathematical equations years after even a brief exposure. 

For 30 years in the early 20th century, Shereshevsky was studied by Russian psychologist Alexander Luria, who diagnosed him with an extreme case of synesthesia. It’s a condition in which a person’s senses become tangled in unusual ways, like experiencing smells in response to sounds, or colors in response to numbers. 

This rich experience of even mundane moments contributed to Shereshevsky’s exceptional memory. Yet his prodigious memory did not result in greater intelligence. In fact, it caused him difficulties with recognizing faces and extracting meaning from printed text. Shereshevsky was so burdened by his memory that he began writing words down and burning them in an unsuccessful attempt to forget something. 

A perfect memory would overwhelm us with details. Which is why the act of learning something new requires being able to both remember what’s important and forget what’s irrelevant. But how do we nudge our brains to forget the noise and remember all the important stuff? 

Focus and flow

Take a moment to recall one of your most vivid memories. It likely consists of a mixture of sensations—sights, sounds, emotions, smells. These rich experiences are wired together using many different areas of the brain. As we learned from Shereshevsky, this variety is a natural boost to memory. 

The challenge is that learning, the act of sitting down to study, isn’t usually an experience rich with sensory inputs. We can overcome this shortcoming by consciously directing focus and attention, giving the memory more to work with. Deep focus signals to the brain that the information being focused on is significant and worth remembering.

The first thing to keep in mind about attention is that it comes in limited supply. Working memory can hold only about seven items at once, a natural limitation that can make it hard to focus on one thing for an extended period of time. Which is why most people have developed the habit of jumping from topic to topic or activity to activity. Multitasking their days away. 

Difficult or not, deliberate focus encodes information into memory and is central to effective learning. A mind divided between disparate tasks is one whose attention and focus are jumping about, reaching for one thing only to put it down moments later to pick up something else, never taking the time to grapple with one item in a meaningful way. 

This is no way to learn. If you don’t find what you’re studying interesting enough to fully focus on, then your brain isn’t going to find it important enough to store long term. The more focus we devote to what we’re learning, in duration and effort, the greater the chance that information can form a lasting impression. 

An interesting thing occurs during those times when we invest all of our attention in a task. We enter what psychologists call a state of flow. Being in the zone. It’s the place where sports stars perform at their peak and musicians hit every note. When we’re in this state, our sense of time becomes warped, we are absorbed completely by what we’re doing, and our performance rises a notch. 

One of the conditions required for entering a state of flow is a challenge. The brain requires an obstacle to overcome, one that isn’t too difficult or too easy. If it is too easy, we are more likely to become bored; if it is overly difficult, we will fail or simply give up. The key lies in possessing the skills necessary to complete the challenge even as you push yourself to your limits. 

Not only are these the moments when we perform at our best, they are the moments we best remember. So how do we encourage such a state when we’re trying to learn something new? Learning something new is certainly a worthy challenge, but how do we ensure the right balance between our skills and the difficulty of the task?

Learning isn’t consuming information

Even perfect control over our focus and attention doesn’t make us immune to forgetting. To make new knowledge stick, we need to regularly interact with what we’re learning.

Jumping from podcast to video to article to book might appear like a good use of time, but it’s a strategy more likely to cause you to forget most of the details you come across. In fact, unless you purposefully revisit what you’re learning, you’ll end up with a mental jumble of accurate, inaccurate, and misremembered information. 

The rate of forgetting is a documented phenomenon. Hermann Ebbinghaus began studying the effects of repetition on memory formation in 1879. He tried to memorize long lists of nonsensical syllables and found that while he would forget quickly after the first attempt, forgetting would progressively slow down the more he retrieved the information. He plotted these observations onto a graph called the “forgetting curve.”

We learn two things from Ebbinghaus and his forgetting curve. The first is that repetition is a necessary part of learning. It’s almost impossible to learn something effectively from a single exposure. The second important finding is that this repetition is most effective after some time has elapsed. Forgetting is beneficial to learning. 

Here’s how it works. You study for a while and then allow time to pass, inevitably forgetting important elements of what you wanted to remember. If you take the time to revisit that lost information, your memories are jogged and the process of forgetting slows dramatically. After several such exposures, forgetting slows to a trickle and you’ve learned something new. 
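Ebbinghaus’s curve is often modeled as simple exponential decay, where how much you retain depends on how long it’s been since your last review and on a “stability” value that grows each time you revisit the material. The sketch below is an illustrative simplification: the exponential form is a common textbook model, and the doubling of stability per review is an assumption for demonstration, not Ebbinghaus’s original data.

```python
import math

def retention(days_since_review, stability):
    """Simplified forgetting curve: R(t) = e^(-t / S).

    Higher stability S means slower forgetting."""
    return math.exp(-days_since_review / stability)

def review(stability, growth=2.0):
    """Each successful review slows future forgetting.

    The doubling factor is an illustrative assumption."""
    return stability * growth

# Retention measured one day after each successive review:
stability = 1.0
one_day_retention = []
for _ in range(4):
    one_day_retention.append(retention(1.0, stability))
    stability = review(stability)
# Retention climbs with each review: ~0.37, ~0.61, ~0.78, ~0.88
```

With each pass, the same one-day gap costs you less: the curve flattens, which is exactly the “forgetting slows dramatically” effect described above.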

Learning through revisiting what we forget creates long-term memories. But it also allows us to look upon what we’re studying with a fresh perspective, often leading to new associations among facts. 

There is another important element to this idea of spaced repetition: We need to retrieve the information “manually” by deliberately recalling it. Recall helps memory formation more than simply looking back over the original material. 

Think of it this way. Once you’ve learned what you’re trying to learn, you probably aren’t going to keep carrying around your study material. Nor do you generally have time to do a web search every time you want to cite a fact. Ensuring that your brain houses the information you want requires practice retrieving it, not re-consuming it. We shouldn’t just practice putting the information in, we need to practice getting it out by testing ourselves. 
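The two ideas above, spacing and active recall, can be combined into a minimal self-testing schedule: lengthen the gap after each successful recall, and start over after a lapse. The rule below is a deliberately simplified sketch for illustration, not any specific published algorithm.

```python
def next_interval(interval_days, recalled):
    """Minimal spaced-repetition rule (illustrative assumption):

    double the review gap after a successful recall,
    restart at one day after a lapse."""
    return interval_days * 2 if recalled else 1

# Simulate one flashcard over five self-tests:
interval = 1
schedule = []
for recalled in [True, True, True, False, True]:
    interval = next_interval(interval, recalled)
    schedule.append(interval)
# schedule == [2, 4, 8, 1, 2]: gaps grow until the lapse resets them
```

The point of the growing gaps is to test yourself right around the time forgetting would otherwise set in, which is when retrieval does the most to strengthen the memory.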

Learning is timeless, knowledge is not

We’ve all misplaced someone’s name at an inopportune moment, had the right answer on the tip of our tongue, or noticed how different a friend’s recollection of a shared experience can be. The brain holds onto only a very small fraction of everything we’ve ever experienced. Forgetting helps us extract what matters and what is useful, and not waste resources by filling the brain with superfluous information. 

Taking advantage of the power of forgetting requires reevaluating how we learn. Quickly glancing over an article or speeding up a podcast to twice its normal speed will not help learning. Effective learning demands more care and deliberate effort. First, we must concentrate on the subject with everything we have, aiming not for speed but rather the slow extraction of meaning and understanding. Second, we must revisit this new knowledge periodically, to ensure that what we remember is accurate and comprehensive. 

Gone are the days when you finished learning. When a Bachelor’s degree meant you held all of the knowledge you would need in your career. With the future defined by constant change, continuing education is important and relevant to all of us. 

When it comes to teaching ourselves the best ways to learn continuously, we shouldn’t be focused on the end result but the process. After all, specific knowledge can become obsolete practically overnight. The ability to quickly and effectively learn new skills is timeless.

Carjacking

The next frontier of cybercrime: car hacking

Dive under the hood of the average car and you’ll find that 100 million lines of code are at work running or supporting practically every system. Cars with first-generation assisted driving systems have 200 million or more lines of code, and climbing.

The challenge is that bugs in code can create security vulnerabilities that provide hackers opportunities to cause problems. Coming soon to a headline near you: a computer virus that shuts down specific models of cars, or a ransomware attack against a major car brand. It’s looking less and less like science fiction.

After all, the more lines of code, the more potential vulnerabilities. One estimate of defect density in code suggests that large, complex programs—like automobile operating systems—can carry one vulnerability per 1,500 lines of code, suggesting more than 66,000 potential vulnerabilities in the average automobile. No wonder cybersecurity is such a big deal at automakers. It’s up to them to ensure car hacking doesn’t become the new carjacking.
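The arithmetic behind that estimate is straightforward; the figures below simply restate the article’s assumptions.

```python
# Assumptions taken from the article, not measured data:
lines_of_code = 100_000_000        # ~100 million lines in the average modern car
defects_per_line = 1 / 1_500       # one potential vulnerability per 1,500 lines

potential_vulnerabilities = lines_of_code * defects_per_line
# ~66,667 potential vulnerabilities, matching the "more than 66,000" figure
```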