1980s: Decade of larger-scale LSI circuits
In the 1980s, Japan overtook the U.S. in the race to increase memory capacities. In addition, as the CMOS technology accumulated in the consumer LSI field was applied to memory and microprocessors, Japan also overtook the U.S. in the shift to CMOS.
Many new products containing semiconductor devices were being introduced to the market – including personal computers, dedicated word processors, video-game consoles, and fax machines – and LSI circuits providing new functions were created to support the development of such products.
1980s: DRAM capacity increases, the shift to CMOS advances, and Japan dominates the market
In the 1980s, Japanese manufacturers took the lead in the competition to increase DRAM capacities.
Japanese manufacturers also surpassed U.S. companies in product quality; this was a golden age for these manufacturers. To increase capacities, three-dimensional structures such as trench structures and stacked structures were introduced to memory cells, making 1-Mbit and then larger memory products possible.
The shift to CMOS also continued, and the first CMOS DRAM products were released as the second generation of 256-Kbit DRAM; all mass-produced 1-Mbit and larger memory devices were CMOS.
1980s: SRAM capacities increase
SRAM capacities also increased in the 1980s, first from 16 Kbits to 64 Kbits, then to 256 Kbits, and next to 1 Mbit. The field of application for SRAM, however, gradually narrowed to small, low-power electronic products, as CMOS DRAM was adopted elsewhere and as the spread of personal computers reduced the number of products dedicated to specific purposes.
1980s: Evolution of nonvolatile memory
In the EPROM field in the 1970s, Japanese companies lagged behind Intel and TI, both of which were rapidly increasing EPROM capacities.
In the 1980s, however, Japan caught up with the pace of increases in capacity. EPROM also shifted to CMOS at about the same time as DRAM, and Japan overtook the U.S. in the shift to CMOS.
The use of mask-programmable ROM also spread because of its low cost, and capacities of this type of memory also increased.
1980s: Advanced CAD tools and the shift to the EWS
As ever larger-scale systems were integrated in LSI circuits, computer-aided design (CAD) tools became more important and semiconductor manufacturers developed more advanced CAD tools to run on mainframe computers.
Before long, with the development and spread of commercial electronic design automation (EDA) tools, the engineering workstation (EWS) replaced the mainframe computer as the platform for running these tools.
1980s: EDA tools for ASIC appear
With the introduction of ASIC devices in gate-array technology and standard cell technology (cell-based ICs) and then the increasing scales of such circuits, placement and routing design had to be automated.
Japanese manufacturers developed their own automatic placement and routing tools or used routing tools from EDA vendors in the U.S. The automatic layout tools of EDA vendors became more advanced and the EDA industry was established.
1980: Single-chip digital signal processor (NEC)
In 1980, NEC developed the world’s first digital signal processor (the µPD7720).
This processor integrated almost all functions necessary for signal processing. It incorporated a 16-bit parallel multiplier and was used in a variety of signal processing devices for modems, voice compression, voice recognition, and image processing.
1981: Microprocessor shifted to CMOS (Hitachi)
In 1981, Hitachi developed the HD6301 CMOS microprocessor.
This was software-compatible with Motorola's 6801 8-bit microprocessor, an industry standard at the time.
Hitachi’s CMOS microprocessor was designed to provide top performance by applying the latest semiconductor technology to the hardware while maintaining software compatibility; the business strategy was to differentiate on hardware while preserving the existing software base.
1983: V30 16-bit microprocessor (NEC)
In 1983, NEC developed the V30 16-bit microprocessor by using CMOS process technology. This microprocessor provided upward compatibility with the Intel i8086, a de facto standard 16-bit microprocessor.
It was widely used in office automation (OA) equipment, such as personal computers, word processors, fax machines, and printers, as well as in switching equipment and industrial control equipment.
1984: The ImPP, a non-von Neumann data-driven processor (NEC)
In 1984, NEC developed a data-driven processor, the µPD7281 ImPP (image pipelined processor).
This was the world’s first processor to use the variable-length pipeline method, and its architecture differed completely from that of conventional computers.
The processor greatly contributed to the rise of the digital image processing market at that time.
1984: Microcontroller with on-chip EPROM (Hitachi)
In 1984, Hitachi developed and released a single-chip microcontroller incorporating EPROM instead of conventional mask-programmed ROM.
Although ensuring the quality of microcontrollers with on-chip EPROM was difficult, the quality of this product fully satisfied the requirements for use in mass production by customers.
Before this product, software had to be written to the mask-programmed ROM during the fabrication process of the semiconductor manufacturer; customers were now able to write the software to memory themselves, markedly reducing the time required by customers from software development to mass-production.
1984: Flash memory released (Toshiba)
Fujio Masuoka of Toshiba presented flash memory at IEDM in 1984. This form of flash memory had a one-transistor cell structure formed through three-layer polycrystalline silicon technology and used an electrical chip erase method. This memory had greater capacity and cost less than Intel’s FLOTOX-type EEPROM and thus became the standard for nonvolatile memory. After this, flash memory progressed in two types: NOR and NAND.
Early 1980s: Gate-array and standard-cell ASICs are released and evolve
The 1970s were a decade of general-purpose-oriented memory and microprocessors, but custom-oriented gate-array and standard-cell ASICs evolved in the 1980s.
The standard cell (cell-based IC) technology of the building-block type was later developed into SoC (system on a chip) technology.
Japanese manufacturers took the lead in the gate-array ASIC field with their advanced fabrication processes.
1985: FPGA (Xilinx, Inc., U.S.A.)
Gate-array products had the disadvantage that, because they were programmed by fixed connections in the metal layers, they had to be handled as custom products from the wafer-fabrication stage onward.
In 1985, Xilinx released the field-programmable gate array (FPGA), which was programmable by the user after shipment.
In this device, logic blocks that can be configured into any desired logic circuit are placed in a grid, and the wiring between the blocks can be programmed by the user.
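The configurable logic block at the heart of an FPGA is typically a lookup table (LUT): a small memory whose stored bits determine which Boolean function of its inputs the block computes. The following is a minimal sketch of this idea; the `LUT` class and its names are illustrative, not any vendor's actual architecture.

```python
# Minimal sketch of an FPGA-style logic block as a lookup table (LUT).
# The truth table plays the role of the configuration bits: loading a
# different table makes the same block compute a different function.

class LUT:
    def __init__(self, n_inputs, truth_table):
        # truth_table[i] is the output for the input pattern whose
        # binary encoding is i (input 0 is the least significant bit).
        assert len(truth_table) == 2 ** n_inputs
        self.n_inputs = n_inputs
        self.truth_table = truth_table

    def eval(self, *inputs):
        # Pack the input bits into an index into the truth table.
        index = sum(bit << i for i, bit in enumerate(inputs))
        return self.truth_table[index]

# "Program" a 2-input block as XOR: outputs for inputs 00, 10, 01, 11.
xor = LUT(2, [0, 1, 1, 0])
print(xor.eval(0, 1))  # 1
print(xor.eval(1, 1))  # 0
```

Reprogramming the device amounts to rewriting each block's truth table and the connections between blocks, which is why the same silicon can implement arbitrary logic after shipment.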
1985: IC card microcontroller with on-chip EEPROM (Hitachi)
After developing the first microcontroller with on-chip EPROM, Hitachi developed a microcontroller with on-chip EEPROM for use in IC cards. EPROM requires ultraviolet light for erasure and therefore cannot be used in IC cards. In addition, the ability to update data in byte units is desirable in IC cards, so EEPROM was adopted. This EEPROM was of the MNOS (metal-nitride-oxide-silicon) type.
1988: 32-bit CISC microprocessors based on the TRON specifications (Hitachi, Fujitsu, Mitsubishi Electric, etc.)
In 1984, Ken Sakamura of the University of Tokyo proposed the TRON (The Real-time Operating system Nucleus) architecture.
Based on this concept, the TRON project started through cooperation between universities and industry.
Through this project, Gmicro-series 32-bit CISC microprocessors based on the TRON architecture were developed, first by Hitachi in 1988 and then by Fujitsu, Mitsubishi Electric, and other companies.
1989: LSIs for analog high-definition television (MUSE system) (various Japanese manufacturers)
In the 1980s, Japan’s analog high-definition MUSE (multiple sub-Nyquist sampling encoding) television system was developed under the leadership of NHK.
As the first generation of MUSE LSI circuits for receivers, 25 dedicated LSIs were developed in 1989 by sharing the task among NHK, multiple home-appliance manufacturers, and semiconductor manufacturers in Japan.
After that, a second generation of LSI circuits was developed by multiple groups of home-appliance and semiconductor manufacturers in Japan from 1992 to 1994. Single-chip LSIs integrating all of the necessary functions were produced from 1995.
Late 1980s: HDLs and logic synthesis tools released
After the Verilog and VHDL hardware description languages (HDLs) for describing digital circuits were released, Synopsys released a logic synthesis tool, Design Compiler, in 1986. The languages and tools dramatically changed the processes and techniques used in designing logic LSIs.