It’s difficult to examine the history of computer science without adjusting the lens to discern how race and gender discrimination determined who was recognized for, and who profited from, advances in this technology. Books like Programmed Inequality by Mar Hicks and Margot Lee Shetterly’s Hidden Figures (later made into the 2016 Academy Award-nominated film) provide a more accurate and inclusive history detailing women’s significant contributions to the field of computer science.
Women have been in computing since before computers were machines, when complicated calculations were done by hand using brainpower. Predicting the mid-1700s return of Halley’s comet (discovered generations earlier) offers one example of early computing. A small group divided the calculations and shared the work of applying the laws of gravity to plot the comet’s return. Alexis-Claude Clairaut led the effort alongside Jérôme-Joseph Lalande and Nicole-Reine Lepaute, a clockmaker’s wife gifted in math. At a time when women were not included in science as a profession, the team accurately predicted the comet’s return past Earth.
By the 19th century, governments were collecting large amounts of data that needed processing—for astronomy and navigation, but also surveys and census data. Women were recruited as a cost-saving measure, since they were paid less than men for their calculation work. Middle-class, educated young women formally trained in mathematics were brought in to work at the Harvard Observatory processing astronomical data from its powerful telescope, and were later recruited by the US Army to calculate artillery trajectories during World War I, and again in World War II. Data processing in the pre-electronic age was seen as dull and repetitive—and was categorized as women’s work.
This gendered division of labor carried over into later years, when women took on software development for modern computers. As described in the New York Times Magazine article “The Secret History of Women in Coding,” “When the number of coding jobs exploded in the ’50s and ’60s as companies began relying on software to process payrolls and crunch data, men had no special advantage in being hired… employers simply looked for candidates who were logical, good at math and meticulous.” Women fit the bill: according to gendered stereotypes of the era, they were better suited to this painstaking work than men. In fact, “The 1968 book Your Career in Computers stated that people who like ‘cooking from a cookbook’ make good programmers.”
After World War II and during the labor shortages of the 1950s, NASA and its predecessor agency, the National Advisory Committee for Aeronautics (NACA), hired a group of exceptional Black mathematicians to help the US win the “Space Race”: Dorothy Vaughan, Mary Jackson, Katherine Johnson, and Christine Darden were just a few of the women hired for the work. “At its bases, NASA employed nearly 80 [B]lack women as computers,” says Margot Lee Shetterly, author of Hidden Figures. One of them, Katherine Johnson, was so revered for her abilities that in 1962 John Glenn asked her to personally verify the flight path of his first launch into space on the Friendship 7 mission. “The astronauts didn’t trust the newfangled digital computers, which were prone to crashing. Glenn wanted human eyes on the problem.”
Setting aside the very early analog contributions of the Babylonians, we can outline the evolution of modern computer science beginning with Ada Lovelace, considered the first computer programmer for her work in the 1830s in England with inventor Charles Babbage and his metal-geared machine, which was designed to execute if/then commands and store results in memory. This partnership of invention and calculation would grow and develop in countless ways over the next two centuries, leading the world from that early attempt at mechanical computation to the sophisticated hand-held computers we can’t seem to put down today.
University and Program Name
The University of Tennessee: Online Master of Computer Science
Merrimack College: Master of Science in Computer Science
Stevens Institute of Technology: Master of Science in Computer Science
Tufts University: Master of Science in Computer Science
For a clear history of the evolution of computer science, it’s helpful to flesh out a timeline of some of the main events that helped build the industry.
After Lady Ada Lovelace and Charles Babbage began work on machine-based operations in the 1830s and early 1840s, computer science and its developing technology mostly focused on astronomy and navigation. The next real milestone was the punch card tabulating system invented by Herman Hollerith, which processed the 1890 US census in a matter of months instead of the years it took to count by hand.
Early 20th-century computing was pushed forward by the world wars and the need to develop programmable electronic computers to support the war effort. From 1943 to 1945, a group of scientists—including John von Neumann and two professors from the University of Pennsylvania, John Mauchly and J. Presper Eckert—designed and built the Electronic Numerical Integrator and Computer (ENIAC).
As the previously mentioned Times article describes, ENIAC weighed more than 30 tons. “Merely getting it to work was seen as the heroic, manly engineering feat,” the Times reports. Women handled the programming: Kathleen McNulty, Jean Jennings, Betty Snyder, Marlyn Wescoff, Frances Bilas, and Ruth Lichterman all contributed. “The men would figure out what they wanted Eniac to do; the women ‘programmed’ it to execute the instructions.” The machine was intended to calculate artillery trajectories, but WWII was over before it was fully up and running.
World War II also saw the work of British codebreaker Alan Turing, who asked the question “Can machines think?” His theoretical model of computation, the Turing machine, and the Turing Test—now seen as one of the first efforts toward artificial intelligence—were visionary contributions from a computer scientist who has been honored only recently.
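For readers curious about what a Turing machine actually is: it is an abstract device consisting of a tape of symbols, a read/write head, and a table of rules that says, for each state and symbol, what to write, which way to move, and which state to enter next. The short Python sketch below is only an illustration of that idea; the rule format and the binary-increment example are our own, not a reconstruction of Turing's original notation.

```python
# A minimal sketch of a Turing machine: a tape of symbols, a read/write head,
# and a table of state-transition rules. The rule format used here,
# (state, symbol) -> (next_state, symbol_to_write, move), is our own choice
# for illustration, not Turing's original notation.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Simulate a single-tape Turing machine and return the final tape contents."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    span = range(min(cells), max(cells) + 1)
    return "".join(cells.get(i, blank) for i in span).strip(blank)

# Hypothetical example rules: add 1 to a binary number by scanning to the
# rightmost digit, then propagating the carry back to the left.
increment_rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt", "1", "R"),
    ("carry", "_"): ("halt", "1", "R"),
}

print(run_turing_machine("1011", increment_rules))  # binary 11 + 1 -> "1100"
```

Despite its simplicity, this rule-table model is, in principle, powerful enough to express any computation a modern computer can perform, which is what made Turing's insight so far-reaching.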
The intertwined histories of the military and computer science continued into the early 1950s, when Grace Hopper, who would later become a Navy rear admiral, is credited with developing the first compiler. Her work paved the way for COBOL (common business-oriented language) and other early high-level programming languages such as Fortran, Lisp, and Snobol.
Later that decade, in work unrelated to any US or UK military cooperation, engineers Jack Kilby and Robert Noyce came up with the idea of a miniature integrated circuit, or microchip, nearly simultaneously and independently of each other.
In the early 1960s, Purdue University established the country’s first computer science department and bachelor’s degree program, with faculty members working in real time to create textbooks and punch cards for newly welcomed computer science majors.
Over the next decades, the world would see the invention of the computer mouse; IBM’s many contributions, including the floppy disk; the rise of Apple, Microsoft, and the personal computer; and advances in artificial intelligence, robotics, and the computer engineering inside the tiny hand-held supercomputers we carry today.
Currently, there are more than a dozen branches in the field of computer science, including algorithms and complexity, computer architecture and organization, computational science, graphics and visual computing, human-computer interaction, information management, intelligent systems, networking and communication, operating systems, programming languages, software engineering, distributed and parallel computing, platform-based development, social and professional issues, and security and information assurance.
Graduates of MSCS programs at leading universities like Berkeley, Stanford, Southern Methodist University, Stevens Institute of Technology, and Tufts University can find full-time work in research labs and at major tech corporations in any of these branches, often with starting salaries in the six figures.
The US Bureau of Labor Statistics (BLS) reports that the next decade will be a boom time for computer science grads. Employment in computer and information technology occupations is projected to grow 13 percent from 2020 to 2030, faster than the average for all occupations, adding about 667,600 new jobs.
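To put those two figures together, here is a rough back-of-the-envelope calculation based only on the numbers quoted above (not a separately reported BLS statistic): 13 percent growth adding roughly 667,600 jobs implies a 2020 employment base of a little over five million positions.

```python
# Back-of-the-envelope check using only the figures quoted above
# (13% growth and ~667,600 new jobs from 2020 to 2030).
new_jobs = 667_600
growth_rate = 0.13

base_2020 = new_jobs / growth_rate   # implied 2020 employment base
total_2030 = base_2020 + new_jobs    # implied 2030 total

print(f"Implied 2020 base:  {base_2020:,.0f} jobs")   # ~5.1 million
print(f"Implied 2030 total: {total_2030:,.0f} jobs")  # ~5.8 million
```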
Some of the top jobs in computer science include:
(Last Updated on February 26, 2024)