This is truly an "Information Age" and sometimes, you need to look at where we've been in order to see the future more clearly!
The famous words "What hath God wrought?" were telegraphed by Samuel F. B. Morse in 1844, although the patent for the "electric telegraph" was submitted in 1837 by Charles Wheatstone. By the time of the Civil War, telegraph communications spanned the United States, and in 1866, the first permanently successful trans-Atlantic cable was laid between Ireland and Newfoundland (Werner von Siemens of Germany was one of the pioneers in the development of reliable submarine cables).
The basic elements of Morse code were the dot ("dit") and the dash ("dah"). In International Morse code (the most prevalent of the Morse code variants), the dot was the minimal duration element, with the dash equal to three times the duration of the dot. Electrically, current flows during both dots and dashes. The time between elements of the same character was one dot; the time between characters was three dots; and the time between words was seven dots! During these "idle" time intervals, the telegraph line was open (no current flow).
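The timing rules above can be sketched in a few lines of Python. This is just an illustrative helper with a tiny sample alphabet (the function name and the two-letter table are inventions for this sketch, not a full Morse table):

```python
# A sketch of the International Morse timing rules described above,
# using a tiny sample alphabet (not the full Morse table).
MORSE = {"S": "...", "O": "---"}

def keying_units(message):
    """Total duration of a message in dot units: dit = 1, dah = 3,
    gap between elements = 1, between characters = 3, between words = 7."""
    total = 0
    for w_i, word in enumerate(message.split()):
        if w_i:
            total += 7                          # inter-word gap
        for c_i, char in enumerate(word):
            if c_i:
                total += 3                      # inter-character gap
            for e_i, element in enumerate(MORSE[char]):
                if e_i:
                    total += 1                  # intra-character gap
                total += 1 if element == "." else 3
    return total

print(keying_units("SOS"))  # 27: S (5) + gap (3) + O (11) + gap (3) + S (5)
```

Note how the varying element counts per character make the total duration irregular -- exactly the property that made Morse hard to automate.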
Unfortunately, Morse code suffered from a couple of drawbacks. Skilled telegraph operators were required, and the work of these operators was grueling (Can you imagine banging on a single key all day?). Also, because the code had varying numbers of elements per character, it was very difficult to automate. But, it still beat the Pony Express!
A Frenchman said that to me! In 1875, a Frenchman named Emile Baudot developed a code suitable for machine encoding and decoding. It consisted of 5 equal-length units (bits) and could represent 32 different code combinations (characters). Since it was necessary to transmit the 26 Latin letters (A-Z) and 10 numeric digits, plus some control characters, two of the 32 combinations, "FIGS" and "LTRS", were used to select between character sets (similar to the CAPS LOCK key on many computer keyboards).
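The LTRS/FIGS shift mechanism can be sketched as a small state machine. The two shift codes below are the actual ITA2 values (LTRS = 11111, FIGS = 11011), but the two-entry character tables are hypothetical stand-ins for the full 5-bit tables:

```python
# Minimal sketch of the Baudot LTRS/FIGS shift mechanism.
# The character tables here are tiny hypothetical stand-ins.
LTRS_TABLE = {0b00001: "A", 0b00010: "B"}
FIGS_TABLE = {0b00001: "1", 0b00010: "2"}
LTRS, FIGS = 0b11111, 0b11011   # the ITA2 shift codes

def decode(codes):
    """Decode a stream of 5-bit codes, switching tables on LTRS/FIGS."""
    table = LTRS_TABLE            # start in letters mode
    out = []
    for code in codes:
        if code == FIGS:
            table = FIGS_TABLE    # shift to figures
        elif code == LTRS:
            table = LTRS_TABLE    # shift back to letters
        else:
            out.append(table[code])
    return "".join(out)

print(decode([0b00001, FIGS, 0b00001, LTRS, 0b00010]))  # "A1B"
```

The shift codes are modal, like CAPS LOCK: every code that follows is interpreted in the selected set until the next shift, which is how 5 bits stretch to cover more than 32 printable characters.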
This code is commonly referred to as "Baudot Code" (naturally), or ITA#2 (International Telegraph Alphabet, #2) today. In Great Britain, this code is sometimes referred to as the "Murray Code".
Unfortunately, Baudot Code was developed before the practical deployment of associated applications and equipment. As such, it did not enjoy widescale deployment until the later invention of the teletypewriter.
The famous words "Mr. Watson, come here, I want you!" were spoken in Boston by Dr. Alexander Graham Bell on March 10, 1876 while working with his invention, the telephone! Alexander Graham Bell filed for a patent for this device on February 14, 1876; TWO HOURS before a similar patent was filed by Elisha Gray of Chicago. After a long legal battle, the United States Supreme Court upheld Dr. Bell's patent.
Work on a public telephone network was well underway in 1878 when the first commercial telephone exchange was brought into service in New Haven, CT. Ultimately, one of the largest companies on this Earth (American Telephone and Telegraph) was spawned. It's been noted that at one point, AT&T employed over a million people!
While the initial invention of the telephone can hardly be considered as a data communications milestone, it is included in this document because almost the entire network backbone of the US Public Switched Telephone Network (PSTN) is now digital!
The invention of the teletypewriter (a.k.a. teleprinter) occurred in the early 1900s. The largest manufacturer of these devices in the United States was the Teletype Corporation. In fact, although the term "teletype" is often used to refer to such teleprinter devices, it is actually a trademark of the AT&T Teletype Corporation.
These teleprinters utilized the 5-bit, 32-character Baudot Code. But because the transmission was machine generated and decoded, it was necessary to delineate the bits in a character. A bit was added to the beginning of the character, called the "Start Bit". Another bit was added to the end of each character. This bit is known as the "Stop Bit". This type of Start/Stop transmission is called "Asynchronous" communication.
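The start/stop framing described above can be sketched directly. This is a simplified illustration (the function name is made up, and it assumes 5 data bits sent least-significant-bit first, with a start bit of 0/"space" and a stop bit of 1/"mark"):

```python
# Sketch of asynchronous (start/stop) framing of a 5-bit character.
# Assumes LSB-first transmission, start bit = 0 (space), stop bit = 1 (mark).
def frame(char5):
    """Wrap a 5-bit character value in start and stop bits."""
    data = [(char5 >> i) & 1 for i in range(5)]  # data bits, LSB first
    return [0] + data + [1]                      # start + data + stop

print(frame(0b10101))  # [0, 1, 0, 1, 0, 1, 1]
```

The start bit's mark-to-space transition tells the receiver a character is coming; the stop bit guarantees the line returns to the idle (mark) state before the next start bit can be detected.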
Teleprinter mechanisms used the presence of DC current flow to indicate a "Mark" (logic "1") or the lack of DC current (open) to represent a "Space" (logic "0"). In an idle state, constant DC current flow exists. When an open state is present, the receiver detects this as a "Space" and prepares to receive a Baudot character. After the character is received, the Stop Bit ensures that the line is returned to an idle state. This method of DC communication represents what is known as a "Current Loop" interface. Multiple parties can easily be "bridged" onto a single line, and line open conditions result in a noticeable constant spacing condition (chattering teleprinter).
You don't see many teleprinters nowadays; they were obsoleted by today's computer printers and visual displays. But widescale use of the teleprinter lasted for over 50 years! Alas, some companies lived and died with teleprinter technology (e.g. Western Union).
Teleprinter transmission technology created the "Tape Punch" and "Tape Reader" devices. Why is this significant? Because it enabled the creation of the first "Store and Forward" data messaging systems.
Teleprinter messages could be received on tape, then resent or broadcast to other teleprinters by using the tape reader. If there were errors in the transmission, the tape could be resent.
Data messaging networks evolved, to allow individuals to communicate with each other in a digital format. Telex and TWX were examples of these early messaging systems. Can anyone remember seeing those TWX numbers on business cards? I can recall hearing the ol' teleprinter chattering away occasionally at our Timeplex office in Largo, FL back in 1984.
In the 1960s, significant advances in data communications character coding resulted in the development of 8-bit characters. In 1962, IBM created and promoted a coding standard known as Extended Binary-Coded-Decimal Interchange Code, or EBCDIC for short. This coding scheme defined 8-bit characters, allowing up to 256 characters to be used. While the world probably would have been better off with a pure 8-bit code, another standard called the American Standard Code for Information Interchange (ASCII) was adopted in 1963 and ultimately won the standards battle.
ASCII was first defined by the American National Standards Institute (ANSI) in ANSI Standard X3.4 in 1968. The ASCII code is also described in ISO 646 (1973) and CCITT V.3, which calls the standard IA5 (International Alphabet #5). ASCII is a 7-bit code, resulting in a maximum of 128 characters. However, ANSI Standard X3.16 (1976) and CCITT Standard V.4 describe the use of an additional eighth bit as a "Parity Check" bit. This bit is set such that the total number of "1" bits in each character is either ODD or EVEN. As such, single-bit transmission errors within a character can be detected! As described in the aforementioned specifications, EVEN parity is suggested for use on asynchronous communications systems and ODD parity for synchronous systems!
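The parity computation is simple enough to show directly. This sketch (the helper name is made up for illustration) sets the eighth bit so the total count of "1" bits comes out even or odd as requested:

```python
# Sketch of the parity-check bit: the eighth bit is chosen so the
# total number of "1" bits in the byte is EVEN (or ODD) overall.
def add_parity(char7, even=True):
    """Return an 8-bit value: 7-bit character plus a parity bit in bit 7."""
    ones = bin(char7 & 0x7F).count("1")
    parity = (ones % 2) if even else (1 - ones % 2)
    return (parity << 7) | (char7 & 0x7F)

# 'A' = 0b1000001 has two "1" bits, so EVEN parity adds a 0 bit:
print(bin(add_parity(ord("A"))))   # 0b1000001
# 'C' = 0b1000011 has three "1" bits, so EVEN parity adds a 1 bit:
print(bin(add_parity(ord("C"))))   # 0b11000011
```

A receiver recomputes the count; if a single bit was flipped in transit, the count's parity no longer matches and the character is flagged as errored (two flipped bits, alas, cancel out undetected).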
In reality, applications may implement EVEN parity, ODD parity, NO parity, always MARK parity, or always SPACE parity. Parity setup problems have been a source of aggravation for anyone who dials into different BBSs. Even in today's modern Internet culture, one must be cognizant of the proper setup for 7-bit or 8-bit FTP (File Transfer Protocol) transmissions!
Additional expansion of the code using the ESCAPE character was defined in ANSI X3.64-1979 in an effort to "standardize" graphic character representations and cursor control. In fact, the DOS operating system is based upon a 256 character set, and ANSI graphic characters have representation through the extra 128 characters that the DOS system allows.
The Serial transmission of ASCII is defined in ANSI X3.15 - X3.16 and CCITT V.4 and X.4. A start and stop bit are added to the character to delineate the character for asynchronous transmission, as in Baudot Code. However, synchronous transmission of ASCII is also defined.
Say what? First conceived in 1937 by Alec Reeves, a voice digitization technique known as Pulse Code Modulation started to be deployed in the United States Public Switched Telephone Network in 1962.
Basically, you start with a 4 KHz analog voice channel. Then you take a "snapshot" of the voice signal's amplitude every 1/8000th of a second (you have to sample at twice the maximum frequency to avoid a problem known as "aliasing"). Then you convert the measured amplitude to a number (the "quantization" process) that is represented by 8 bits. Thus, PCM requires 64 KBPS of digital bandwidth (8 KHz * 8 bits). This basic channel represents the first level of a digital hierarchy, known as a DS0.
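The DS0 arithmetic above is worth a quick sanity check (variable names here are just for illustration):

```python
# The PCM arithmetic above, step by step.
voice_bandwidth_hz = 4_000
sample_rate = 2 * voice_bandwidth_hz        # Nyquist: 8,000 samples/second
bits_per_sample = 8                         # quantized amplitude
ds0_rate = sample_rate * bits_per_sample    # bits/second for one voice channel
print(ds0_rate)  # 64000
```

Sampling any slower than twice the highest frequency would fold high-frequency content back into the voice band (aliasing), which is why 8,000 samples per second is the floor for a 4 KHz channel.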
A special type of Time-Division Multiplexer (TDM) called a "Channel Bank" takes 24 of these 64K DS0 channels and combines (multiplexes) them into a single aggregate rate of 1.544 MBPS. This rate is the combination of the channel data payload of 1.536 MBPS (64 KBPS * 24 Channels) + 8 KBPS of framing and synchronization bits. The 1.544 MBPS rate is known as the DS1 level in the digital hierarchy. Facilities that support this rate are usually referred to as "T-Spans" or "T1" circuits.
International standards were developed later. Although the basic hierarchical DS0 rate of 64 KBPS was preserved, the algorithm for converting the voice signal to a digital signal is different. Also, the International standard calls for 30 voice channels + a 64 KBPS synchronization channel + a 64 KBPS signaling channel. Therefore, these systems operate at a rate of 2.048 MBPS (1.920 MBPS + 64 KBPS + 64 KBPS). Facilities that support this rate are usually referred to as "E1" circuits.
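The DS1 and E1 rates quoted above follow directly from the channel counts (again, variable names are just for illustration):

```python
# The DS1 (T1) and E1 arithmetic above, as a quick sanity check.
ds0 = 64_000                        # bits/second per voice channel
t1 = 24 * ds0 + 8_000               # 24 channels + 8 KBPS framing/sync bits
e1 = 30 * ds0 + ds0 + ds0           # 30 channels + sync channel + signaling channel
print(t1, e1)  # 1544000 2048000
```

Note the structural difference: the T1 frame steals only 8 KBPS for framing, while E1 dedicates two full 64 KBPS channels (timeslots 0 and 16) to synchronization and signaling.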
Using a transmission line code known as Bipolar-Alternate Mark Inversion (AMI), a 1.544 MBPS T1 circuit requires 772 KHz of analog bandwidth. So, why go digital? I could use Frequency Division Multiplexing (FDM) and combine those same 24 channels into a 96 KHz (4 KHz * 24) analog pipe, right? While FDM saves bandwidth, noise is added as the signal travels through every amplifier and modulator. In a digital system, "ones" and "zeroes" go in, and "ones" and "zeroes" go out. Since major sources of analog noise are removed in digital systems, circuit lengths can be extended, and network topologies simplified through the reduction of the number of circuits required between any two telephone exchanges. Quality improves, operating costs decrease!
In addition to the development of 8-bit communication codes and Pulse Code Modulation systems, the sixties brought forth a number of other significant contributions.
The deployment of digital transmission facilities resulted in the development of standard digital hierarchies, as noted in the previous Pulse Code Modulation section.
Integrated circuit (IC) development created Large Scale Integration (LSI) IC technology.
CRT terminals, developed in the 1950s, saw increased use as the preferred I/O device for computer systems. Computer architectures changed to accommodate interactive I/O.
The first communications satellites were launched.
The Carterfone decision in 1968/1969 allowed devices which were beneficial and not harmful to the network to be connected to the PSTN. This spawned the development of many modem and data communications companies!
Dataphone Digital Service (DDS) started deployment in 1974, bringing digital transmission facilities to the customer's premises. DDS circuit deployment also accelerated the conversion to digital networking within the Bell System.
X.25 began widescale deployments at the end of the 70s, introducing packet switched networking. Large X.25 public networks evolved, such as Telenet (now "Sprintnet") and Tymnet.
The continued development of Integrated Circuits resulted in widespread availability of LSI and VLSI (Very Large Scale Integration) devices, increased reliability, and decreased costs.
During the 1980s, the development of Dial Modem technology accelerated at a frantic rate.
On January 1, 1984, AT&T divested itself of its 22 Bell System operating companies as the result of a 7-year antitrust suit filed against AT&T by the U.S. Department of Justice and the settlement agreed upon to resolve it. Ultimately, the Bell Operating Companies ("BOC"s) were grouped together into seven Regional Bell Operating Companies ("RBOC"s):
- Ameritech Corporation
- Bell Atlantic Corporation
- BellSouth Corporation
- Nynex Corporation
- Pacific Telesis Group
- Southwestern Bell Corporation
- US West Incorporated
AT&T itself, now divested, consists of two basic organizations:
- AT&T Communications: Provides long distance, inter-LATA, and network services.
- AT&T Technologies: Comprises AT&T Bell Labs, AT&T International, AT&T Information Systems, and AT&T Network Systems.
Divestiture caused the carriers to compete in the only unregulated area: business communications services. This resulted in an explosion in business communications, starting with the availability of T1 (1.544 MBPS) services in 1984.
Multiplexing vendors launched new, network-savvy, Time Division Multiplexers. Company networks consolidated voice and data circuits into single high-speed aggregate bit streams; saving money and manpower, while improving network survivability. These new "microprocessor muxes" offered features such as redundancy and automatic circuit rerouting while supporting a wide variety of data and voice I/O types.
Local Area Network deployment accelerated, offering users a new view of data communications networking: the ability to access anything from anywhere, bandwidth-on-demand for data transfers, standardized connectivity, etc. Computer network architectures shifted from "Centralized Host" to "Client-Server" designs.
Signaling System #7 (SS7), a common-channel signaling protocol used between PSTN switches, began widescale deployment in the US PSTN. Sweden was among the first countries to implement SS7 networking, while Bell Atlantic was among the first Local Exchange Carriers (LECs) to complete SS7 network implementation. This offered additional CLASS (Custom Local Area Signaling Services) services: Automatic Callback, Automatic Recall, Computer Access Restriction, Distinctive Alert, Caller ID, Selective Call Acceptance/Blocking, etc.
After the completion of SS7 within the PSTN backbone, additional telephone networking services were offered to business customers. Particularly, enhanced PBX network services such as Virtual Private Networks (VPNs) evolved. These services allowed flexible dialing for business users, and allowed the carrier to integrate Public and Business communications throughout the carrier's SS7 network.
The attractive Virtual Network options for voice services, combined with continued cost reductions in T1 services, have resulted in the segregation of voice and data in the Wide Area Network (WAN). As such, a "new" standard, known as Frame Relay, began deployment. Frame Relay is particularly adept at transporting LAN and X.25 traffic, and Public Frame Relay transport services are available from many carriers.
Wireless communications system use has exploded, with dramatic growth in Cellular voice and data technologies. Of particular interest are the merger of AT&T and McCaw Cellular (Cellular One) and the development of the Cellular Digital Packet Data (CDPD) standards. Additional frequency allocations have recently occurred for the development of wireless Personal Communications Systems (PCS), in an unprecedented spectrum auction by the Federal Communications Commission (FCC).
Slow but steady increases are seen in the use of Integrated Services Digital Networks (ISDN), providing higher speed digital access capabilities to residences and businesses.
New methods of integrating voice and data, as well as Local Area and Wide Area networks, are under development. These new "cell-based" transmission technologies are known as Switched Multimegabit Data Service (SMDS), Asynchronous Transfer Mode (ATM), and Broadband ISDN.