
What Is Machine Learning?

By Eddie Huffman | October 25, 2022

Machine learning, a subfield of artificial intelligence, uses algorithms that scan massive data sets and modify their behavior based on what they find.


War’s remnants—chiefly landmines and other unexploded ordnance—kill and maim thousands every year. Scientists at Oklahoma State University are working on a new approach to find and safely dispose of these deadly vestiges of warfare: machine learning. Using drones as part of a sophisticated detection system, these researchers are employing the latest technology to make a slow, dangerous process faster and safer.

Cleaning up war zones is one of many real-world applications of machine learning. Neural networks also make their presence felt in healthcare, transportation, manufacturing, entertainment, and beyond, from identifying the origin of elusive cancer cells to powering recommendation engines that suggest your next song on Spotify or TV show on Netflix.

What is machine learning? This article examines that question and also discusses:

  • Benefits and drawbacks of machine learning
  • Jobs related to machine learning

What is machine learning?

Machine learning is a branch of artificial intelligence in which computers can use experience to learn and perform beyond their original programming. The computer takes a set of data (such as photos or bank transactions), analyzes it, and makes predictions or discovers patterns. The bigger the dataset, the better the information it produces.

It’s an optimization tool that has increasingly become a part of everyday life. Applications include speech recognition, product recommendations, facial recognition, fraud detection, customer service chatbots, and self-driving cars. “In just the last 5 or 10 years, machine learning has become a critical way, arguably the most important way, most parts of AI are done,” explains Thomas W. Malone, founding director of the MIT Center for Collective Intelligence. “So that’s why some people use the terms AI and machine learning almost as synonymous … most of the current advances in AI have involved machine learning.”

Three basic paradigms underlie machine learning:

  • Reinforcement learning, in which a machine is trained to take appropriate actions to maximize a reward in a given situation
  • Supervised learning, in which initial input data determines appropriate action
  • Unsupervised learning, in which unlabeled datasets are analyzed and clustered independently
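The difference between the supervised and unsupervised paradigms can be sketched in a few lines of Python. The fraud-detection-flavored data, the nearest-centroid classifier, and the single-split clustering below are illustrative assumptions chosen for brevity, not production machine learning algorithms:

```python
# Supervised learning: labeled examples (transaction amount, label)
# train a model that predicts labels for new, unseen data.
labeled = [(12.0, "normal"), (15.0, "normal"), (900.0, "fraud"), (1100.0, "fraud")]

def train_centroids(examples):
    """Learn the average value (centroid) of each label."""
    sums, counts = {}, {}
    for value, label in examples:
        sums[label] = sums.get(label, 0.0) + value
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def predict(centroids, value):
    """Assign the label whose learned centroid is closest to the value."""
    return min(centroids, key=lambda label: abs(centroids[label] - value))

centroids = train_centroids(labeled)
print(predict(centroids, 14.0))    # "normal" -- close to the normal centroid
print(predict(centroids, 950.0))   # "fraud" -- close to the fraud centroid

# Unsupervised learning: no labels at all. The algorithm groups raw
# values on its own -- here, a single split around the overall mean.
values = [11.0, 14.0, 13.0, 980.0, 1020.0]
mean = sum(values) / len(values)
clusters = {"low": [v for v in values if v < mean],
            "high": [v for v in values if v >= mean]}
print(clusters)  # the two groups emerge without any labels being given
```

The key contrast: the supervised model needed the `"normal"`/`"fraud"` labels to learn from, while the clustering step discovered the same two groups from the raw numbers alone.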

Benefits and drawbacks of machine learning

The breakthroughs enabled by machine learning have made an outsized impact on the world in the 21st century, from treating cancer to putting self-driving cars on the road. New machine learning models come online every day.

Even the most enthusiastic proponents of artificial intelligence and machine learning, however, acknowledge that the technology has severe limitations. “AI has made truly amazing strides in the past decade, but computers still can’t exhibit the common sense or the general intelligence of even a 5-year-old,” said Yoav Shoham, professor emeritus of computer science at Stanford, in 2017. Researchers at DeepMind have taken a step back, creating a deep learning system that gathers knowledge the way a baby does.

Here’s a look at some of the things machine learning algorithms do well and some of the areas where the technology falls short.

Thinking big

Automation—including the ability to process massive amounts of data in a short time—represents one of the biggest advantages of machine learning. You could search the internet yourself, for example, but it would take you years to do what Google can do in an instant. “It may not only be more efficient and less costly to have an algorithm do this, but sometimes humans just literally are not able to do it,” said Aleksander Madry, director of the MIT Center for Deployable Machine Learning.

Limited capabilities

Machine learning still depends heavily on its original programming. As a process goes into action, it might go off the rails. “We don’t yet have sufficient control systems that understand when the model might be going out of whack, or whether we’re really using it for the right cause,” observes Claudia Perlich, chief scientist at Dstillery, a marketing and advertising analytics company. “Most ML models typically are built for a specific use case, and the person who built it understands the boundaries. But beyond that, generalization becomes questionable, at best.”

Targeted treatment

Doctors can’t always pinpoint the organ or body part where cancer originates. That means oncologists have to resort to generalized treatment that can do collateral damage and have limited success. Machine learning is helping researchers at the Koch Institute for Integrative Cancer Research at MIT and Massachusetts General Hospital analyze cancer cells and predict their origins via pattern recognition. “Machine learning tools like this one could empower oncologists to choose more effective treatments and give more guidance to their patients,” says Salil Garg, a member of the research team. The decision-making ultimately falls to the doctors, but machine learning that analyzes big data helps them make much better-informed decisions.

Boosting manufacturing

Manufacturers can use machine learning to head off problems and improve productivity, according to ELEKS, an international software company founded in 1991 whose software runs power systems across Eastern Europe. An article on the ELEKS website identifies six areas in which machine learning can provide valuable metrics and enhance the manufacturing process:

  • Predictive maintenance
  • Predictive quality and yield
  • Digital twins (a real-time digital model of a physical object used for diagnostics, production evaluation, and performance predictions)
  • Generative design/smart manufacturing
  • Energy consumption forecasting
  • Cognitive supply chain management

Enabling new threats

Machine learning has undeniable value as a cybersecurity tool, but its usefulness comes with a cost: malicious hackers can employ it as well. The Centre for European Policy Studies (CEPS) warns that machine learning techniques “will make sophisticated cyber-attacks easier and allow for faster, better targeted, and more destructive attacks.” In other words, the technology will not only introduce new threats but also change the character of existing ones.

A strong human element in the form of supervised learning will help combat such threats, CEPS says: “Forms of controls and human oversight are essential. Furthermore, AI systems, unlike brains, are designed. Therefore, all the decisions upon which the systems are designed should be auditable.”

Saving time, relieving tedium

Need to review scores of loan documents or business contracts? There’s an app for that. Machine learning can eliminate the need for people to plow through endless documents, freeing them from repetitive, mind-numbing data analysis and allowing them to focus on higher-level duties. “Automating parts of a job will often increase the productivity and quality of workers by complementing their skills with machines and computers, as well as enabling them to focus on those aspects of the job that most need their attention,” says Irving Wladawsky-Berger, a former IBM executive who is now a visiting lecturer in information technology at the MIT Sloan School of Management.

Jobs related to machine learning

Skilled specialists create the machine learning technologies that have revolutionized the way the world works. Here are some of the jobs that will allow you to join that revolution:

  • Software engineer and software developer: Create software solutions.
  • Data analyst: Access, assess, and share data.
  • Machine learning engineer: Build and manage platforms.
  • Data scientist: Collect, analyze, and interpret data.
  • Computer vision engineer: Design and implement image-processing algorithms.
  • Natural language processing scientist and computational linguist: Help computers learn human language.
  • Business intelligence developer: Track business and market trends.
  • Human-centered machine learning designer: Foster systems that learn from and collaborate with humans.

What degree do you need to work on machine learning?

The path to a career in machine learning most commonly begins with an undergraduate degree in computer science, computer engineering, or a related field such as physics, statistics, applied mathematics, data science, or data analytics. Machine learning is one of the specialties offered in the School of Engineering at Case Western Reserve University. Students earning an M.S. in computer science at Southern Methodist University can specialize in artificial intelligence and machine learning.

Some schools, including Indiana University and Carnegie Mellon University, offer a bachelor’s degree in artificial intelligence. Schools offering a certificate program in machine learning include the Massachusetts Institute of Technology and the University of Washington.

Once you’ve learned the difference between dimensionality reduction and decision trees as an undergrad, you may be able to enhance your job prospects and earnings potential with an advanced degree. Carnegie Mellon offers both a master’s degree and a Ph.D. in machine learning. The Georgia Institute of Technology offers a Master of Science in Computer Science with a specialization in machine learning, as well as a Ph.D. in machine learning.


About the Author

Eddie Huffman is the author of John Prine: In Spite of Himself and a forthcoming biography of Doc Watson. He has written for Rolling Stone, the New York Times, Utne Reader, All Music Guide, Goldmine, the Virgin Islands Source, and many other publications.

About the Editor

Tom Meltzer spent over 20 years writing and teaching for The Princeton Review, where he was lead author of the company's popular guide to colleges, before joining Noodle.






Categorized as: Artificial Intelligence, Computer Science, Data Science, Information Technology & Engineering