Tech Topic: The History & Future of Computers

The History of Computers

According to Professors Frank Vahid & Susan Lysecky, authors of the zyBook “Computing Technology for All”, computers originated from telephone switches in the early 1900s. Engineers at the time discovered that switches could perform calculations by assigning meaning to their positions, such as 1 for ON and 0 for OFF. In the 1940s the first of these computers were built; they were large, existed in only a few parts of the world, and were used to process large amounts of data. In 1943, work began on the ENIAC, the first U.S. general-purpose computer, for the Army’s Ordnance Proving Ground. According to Professor Maarten Bullyok, its intended purpose was to compute ballistic trajectories, but it proved capable of much more. The idea of having a personal computer was far-fetched at the time because operating one was laborious: the machines had to be kept in air-conditioned rooms with special wiring and outlets. Regarding the future of computers, development is underway on what is called a quantum computer. This innovation is intended to solve more complex problems more quickly than before by representing 1s and 0s simultaneously in what is called a quantum bit, or qubit. The video below, created by Forbes, explains in depth the benefits of this development.

Future Developments: The Quantum Computer

Hardware & Software

As the decades brought further advances and the need for smaller yet more complex computing systems, the computer chip was created in the 1970s. Prior to its creation, computers ran on thousands of switches that occupied entire rooms; the chip delivered the same capabilities, and more, in a handheld package. This shrinking trend is known as Moore’s Law: switch sizes shrink so that the number of switches (transistors) that fit on a chip roughly doubles every two years. This was a huge factor in developing the computers we have today, in both size and complexity. When the computer chip debuted, it could hold only a single central processing unit (CPU) to run program instructions (1s and 0s); chips of today can hold two, four, eight, or more CPU cores. As a result, today's and tomorrow's computers are smaller and thinner, offering ease of transport, better data management, and faster processing times.
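
To make that doubling concrete, here is a minimal sketch in Python that projects transistor counts under Moore's Law; the 1971 starting value of 2,300 transistors (the Intel 4004) is only an illustrative assumption, and real chips have not followed the curve exactly.

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every ~2 years.
# The 2,300-transistor starting point (Intel 4004, 1971) is used only as an example.

def projected_transistors(start_count: int, start_year: int, end_year: int) -> int:
    """Project a transistor count assuming one doubling every two years."""
    doublings = (end_year - start_year) // 2
    return start_count * (2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(2_300, 1971, year):,}")
```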

Programming Languages

The shift from mechanical systems to electronic computing led to the first major advancements, laying the groundwork for future developments. In this period the focus was on automating calculations and processing data, concepts that are foundational to IT today, especially in terms of computational power. In the 1960s and earlier, programming languages were lower level than today’s and were used mainly for straightforward numerical computations. Internet pioneer Steve D. Crocker stated in an article titled “Arpanet & Its Evolution” that “programming was tedious and prone to errors. In a batch processing environment, the programmer might have to wait several hours or even a full day to get results back, only to discover there was an error in the program”. This issue would drive the innovation of interpreted programming languages such as LISP and BASIC. American computer scientist Grace Hopper worked as a technical consultant on the effort that defined COBOL, an English-based programming language designed for business uses such as financial record keeping, and it is still used today. ALGOL (algorithmic language), used for mathematics in scientific computing, and COBOL were among the first higher-level programming languages, and at the time they were typically used only by engineers or business agencies. Languages of today such as Python and C++ can infer a variable’s type, meaning you do not always have to declare it explicitly; the type simply follows from the assigned value, as sketched below.
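
Here is a minimal sketch of that idea in Python; the variable names and values are made up for illustration, and C++ offers a similar effect with its auto keyword.

```python
# A minimal sketch of type inference: the type of each variable follows from
# the value assigned to it, with no explicit type declaration required.

count = 42              # inferred as int
price = 19.99           # inferred as float
name = "Grace Hopper"   # inferred as str
flags = [True, False]   # inferred as a list of bool

for value in (count, price, name, flags):
    print(type(value).__name__, "->", value)
```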

Computer Applications

In 1975, Bill Gates co-founded Microsoft. The company went on to create Microsoft Windows heading into the 1980s; announced in 1983, Windows was built around a graphical user interface (GUI), which allows computer programs and files to be represented by icons and other graphics on the screen. Though unsuccessful at first, the 1990s saw a transformation in software applications, most notably the creation of Word, Excel, and PowerPoint. A word processor is based on formatted text input and uses a cursor, as in most applications, to indicate where a user’s next input will appear. It also supports inserting columns, drawings, and images, as well as margin and text formatting, unlike plain text files, which can only store text and do not support formatting.

The story of VisiCalc, the first digital spreadsheet software, created before Microsoft Excel.

Before the creation of Microsoft Excel in 1985, Dan Bricklin, then a Harvard student, and Bob Frankston created VisiCalc. It was the first digital spreadsheet application to debut for personal computers and was the influence behind Microsoft Excel and Google Sheets. A spreadsheet application displays tables or charts of data in an organized manner using alphabetically labeled columns and numerically labeled rows. Each column and row is made up of cells. The data that is entered can be manipulated by sorting or categorizing cell contents, or by color. Formulas are used to quantify the entered data: by highlighting the needed cells and clicking the available mathematical icons, a user can compute a Sum, an Average, or the minimum value within the selected cells, as in the sketch below.
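
Here is a rough, plain-Python sketch of what those formulas compute; the column of values is made-up sample data rather than anything from a real worksheet.

```python
# A plain-Python sketch of what spreadsheet formulas such as SUM, AVERAGE,
# and MIN compute. The "cells" below are made-up sample values.

cells = [120.0, 75.5, 240.0, 98.25, 150.0]   # imagine these sit in cells A1:A5

total   = sum(cells)               # comparable to =SUM(A1:A5)
average = sum(cells) / len(cells)  # comparable to =AVERAGE(A1:A5)
minimum = min(cells)               # comparable to =MIN(A1:A5)

print(f"Sum: {total}, Average: {average:.2f}, Min: {minimum}")
```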

PowerPoint, which originated for the Mac in 1987 under the working name Presenter, is an application that uses text and graphics, including animation, displayed on slides rather than on digital paper as in Word. Various editing tools similar to those in Word and Excel are available, such as text formatting and alignment. Presentations, however, also provide an array of themes that can be applied and combined with those formatting capabilities.

Currently, AI is a leading innovation in computer applications; one example is Microsoft 365 Copilot. Copilot is a new application that uses AI within various communication and productivity applications such as Outlook, Teams, Excel, Word, and PowerPoint to rewrite, transform, or gather analytics across multiple applications. Copilot offers the ability to turn a document created in Word into PowerPoint slides within seconds, or to rewrite emails to make them sound more professional or more direct.

Database Management

In 1890, the inventor Herman Hollerith created the first punched-card tabulating machine. The tabulating machine was used to compile statistics for the U.S. Census, an early example of what would eventually become databases. According to Professors Frank Vahid & Susan Lysecky, data such as gender, age, and residence were stored on punched cards. Because of the tabulating machine, the census tabulation process was completed two years faster than usual. The machine's success enabled Hollerith to sell tabulating machines to many nations; more than 10 nations used the machines for their 1900 census. Today, data is defined as a collection of values that are translated into readable information. Having a database means having the ability to access material through organized methods, for example how a retail store keeps inventory or how a hospital logs and retrieves patient information online; a minimal sketch of the inventory example appears below. Given the large amounts of data in constant movement across the globe, and the need for transparency, having a secure database management system (DBMS) is essential to meet various global needs. Server farms, for example, were created to store, process, and manage large amounts of data efficiently. They offer physical storage for the database, which resides on storage servers via hard disk drives (HDDs) or solid-state drives (SSDs). Data is replicated to provide availability and loss recovery in certain instances. Many companies use databases hosted at server farms through providers such as Google Cloud and Microsoft Azure; this method is called cloud hosting. It enables information, photos, and media to be retrieved by anyone around the world with access.
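
The snippet below is only a toy illustration of the retail-inventory idea, using Python's built-in sqlite3 module as a stand-in DBMS; the table name, columns, and rows are invented for the example, and a production system would run on a server-based DBMS.

```python
# A toy DBMS example: storing and retrieving retail inventory with sqlite3.
# The table, columns, and rows are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")  # an in-memory database, just for the demo
conn.execute(
    "CREATE TABLE inventory (sku TEXT PRIMARY KEY, name TEXT, quantity INTEGER)"
)
conn.executemany(
    "INSERT INTO inventory VALUES (?, ?, ?)",
    [("A100", "Keyboard", 42), ("A101", "Monitor", 15), ("A102", "Mouse", 80)],
)
conn.commit()

# Organized retrieval: list items that are running low on stock.
for sku, name, qty in conn.execute(
    "SELECT sku, name, quantity FROM inventory WHERE quantity < 50 ORDER BY quantity"
):
    print(sku, name, qty)
```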

Security of Computers

In the earlier ages of computers, hacking did not hold the same meaning it does today. Because computers were scarce and difficult to own, in most cases only a few like-minded individuals knew how to operate them. Most computers were kept in locked rooms, and people would attempt to break in out of sheer curiosity. In the 1970s, when the expansion of computers began, information started being secured with passwords, but discussions of computer security truly arose when the first virus, called “Creeper,” was created by BBN Technologies engineer Bob Thomas. Creeper is regarded as the first known computer virus that could move about on the ARPANET (the Advanced Research Projects Agency Network), the precursor to the Internet. Dr. Eric Cheng, a professor of early childhood education, wrote an article on institutional strategies in cybersecurity and stated that “Creeper was subsequently made to move across the ARPANET and the self-replicating Creeper was deleted. The conflict between the two programs exposed the network vulnerability of ARPANET and though regarded as harmless at the time, it raised the issue of network security”.

Teletype message of the test virus "Creeper"

Today, even though cybersecurity is much more advanced than before, weaknesses still exist and are referred to by some as security holes. The term "security hole" refers to a point where a computer or other device is at risk of a security breach. A modern security method is the digital certificate. According to the zyBook by Professors Frank Vahid & Susan Lysecky, most data is exchanged over the Internet using public keys, a form of data encryption. The issue is that public keys are available to anyone, which hackers can exploit to install malicious software on one's computer with the intent to extort sensitive data. A digital certificate combats this by authenticating the public key, confirming that the data originated from a legitimate source, which provides a layer of verification for exchanged data; a rough sketch of this check appears below. Software such as anomaly-based intrusion detection systems (IDS) is also widely used in cybersecurity today.
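
As a minimal sketch of certificate checking, assuming nothing beyond Python's standard library: the ssl module verifies a server's certificate chain against trusted certificate authorities before any data is exchanged. The hostname here is only a placeholder.

```python
# A minimal sketch of how a digital certificate authenticates a server's public key.
# ssl.create_default_context() checks the certificate chain and hostname by default;
# the connection fails before any data is sent if verification does not pass.
import socket
import ssl

hostname = "www.example.com"  # placeholder host, used only for illustration
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()  # available only after successful verification
        print("Certificate subject:", dict(item[0] for item in cert["subject"]))
        print("Issued by:", dict(item[0] for item in cert["issuer"]))
```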

According to a scholarly article titled "Modern Cyber Security & Advanced Machine Learning", written collaboratively by several professors in the IT field, mixing this advanced technology with current measures can help identify anomalies in normal system behavior. The future of cybersecurity lies in combining current security methods with advanced machine learning (ML). As technology grows more complex, cyber threats and attacks increase, and so does this need. The combination promises heightened detection of vulnerabilities, automated safety measures that make users aware of weaknesses in an effort to decrease human error, faster reaction times to threats, and data visualization that helps security teams weed out threats within large amounts of data more efficiently and accurately in real time. A rough sketch of the anomaly-detection idea follows.
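
The snippet below is only a toy anomaly detector, not the method from the cited article: it runs scikit-learn's IsolationForest over made-up request-rate numbers, whereas a real anomaly-based IDS would learn from many features of live network and system activity.

```python
# A toy version of anomaly-based detection, the core idea behind an anomaly IDS.
# The traffic numbers are invented; real systems use many features of live activity.
from sklearn.ensemble import IsolationForest

# Mostly normal request rates, with two suspicious spikes at the end.
traffic = [[52], [48], [55], [50], [47], [53], [49], [51], [480], [500]]

model = IsolationForest(contamination=0.2, random_state=0)
model.fit(traffic)

for rate, label in zip(traffic, model.predict(traffic)):
    status = "anomaly" if label == -1 else "normal"
    print(f"{rate[0]:>4} requests/min -> {status}")
```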
