In the emerging economy there is a new infrastructure, based on the internet, that is causing us to scrutinize most of our assumptions about business. As a skin of networks - growing in ubiquity, robustness, bandwidth, and function - covers the skin of the planet, new models of how wealth is created are emerging.


Monday, January 19, 2015

Optical character recognition (OCR)

Systems for recognizing machine-printed text originated in the late 1950s and have been in widespread use on desktop computers since the early 1990s. It is still an active area of research because the problem is complex in nature.

OCR draws on the fields of pattern recognition, image processing, and natural language processing. OCR technology has advanced to the point where today’s systems are indeed useful for processing a large variety of machine-printed documents. Accuracies of 99% or more are routinely achieved on cleanly printed pages.
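The pattern-recognition core of OCR can be illustrated with a deliberately tiny sketch: each character of a hypothetical "font" is stored as a small binary bitmap template, and an unknown glyph is classified as the character whose template it matches best. This toy example (the 3x3 templates and the matching rule are illustrative assumptions, not a real OCR engine) shows why clean input helps: the fewer flipped pixels, the closer the glyph is to its true template.

```python
# Toy OCR by template matching: classify a glyph as the character whose
# stored bitmap it agrees with on the most pixels. (Illustrative sketch;
# real OCR engines use far richer features and language models.)

# Hypothetical 3x3 bitmap templates for a two-character toy "font".
TEMPLATES = {
    "I": (0, 1, 0,
          0, 1, 0,
          0, 1, 0),
    "L": (1, 0, 0,
          1, 0, 0,
          1, 1, 1),
}

def recognize(glyph):
    """Return the template character whose pixels best match the glyph."""
    def score(template):
        # Count pixel positions where glyph and template agree.
        return sum(a == b for a, b in zip(glyph, template))
    return max(TEMPLATES, key=lambda ch: score(TEMPLATES[ch]))

# A noisy "I" with one flipped pixel still matches the right template.
noisy_i = (0, 1, 0,
           0, 1, 1,
           0, 1, 0)
print(recognize(noisy_i))  # I
```

Scanning artifacts (skew, smudges, low resolution) flip pixels and shrink the margin between the best and second-best template, which is why the advice below about clean originals and larger fonts matters in practice.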

Optical character recognition (OCR) is the process of scanning printed pages as images on a flatbed scanner and then using OCR software to recognize the letters as ASCII text. It is a technology that involves reading typewritten, computer-printed, or hand-printed characters from ordinary documents and translating the images into a form that the computer can process.

The OCR software has tools for both acquiring the image from the scanner and recognizing the text.

OCR works best with originals or very clear copies and mono-spaced fonts like Courier. For good OCR results, use a font size of 12 points or greater.

With continued advancement in microcomputer technology, further improvements in OCR can be expected. OCR machines using dedicated microprocessors will be able to achieve greater speed and therefore be more effective in satisfying traditional OCR applications.

Friday, March 5, 2010

The first microprocessor

In the mid-1940s, John von Neumann, a brilliant mathematician at Princeton University, conceived a theoretical machine in which binary logic and arithmetic could work together in storing detailed programs and performing complex calculations.

Von Neumann demonstrated that one could encode instructions to the machine in the same language used for the data it processed.

This great advance meant that a computer could read instructions, accept data, perform calculations, and store results all in a single code.
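The stored-program idea above can be sketched in a few lines. In this toy machine (the three-field instruction format and the ADD/HALT operations are assumptions for illustration, not any historical design), instructions and data sit side by side in one memory, and the processor simply fetches and interprets whatever the program counter points at:

```python
# Minimal sketch of a stored-program machine: code and data share one memory,
# exactly as in von Neumann's design. Each instruction occupies three cells:
# an opcode and two operand addresses.

def run(memory):
    """Fetch and execute instructions from memory until HALT."""
    pc = 0  # program counter: where in memory the next instruction lives
    while True:
        op, a, b = memory[pc], memory[pc + 1], memory[pc + 2]
        if op == "ADD":        # memory[b] += memory[a]
            memory[b] += memory[a]
        elif op == "HALT":
            return memory
        pc += 3                # advance to the next instruction

# One memory holds both program (cells 0-5) and data (cells 6-7).
memory = ["ADD", 6, 7,         # add the value in cell 6 into cell 7
          "HALT", 0, 0,
          2, 40]               # data: 2 and 40
run(memory)
print(memory[7])  # 42
```

Because instructions are just values in memory, a program could in principle read or rewrite its own code - the property that made the stored-program computer so much more flexible than fixed-wired machines.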

These ideas pointed the way toward the design, construction, and operation of units that can be employed separately or combined for greater power and flexibility.

They also focused attention on the newly developed integrated circuits in which very small, highly reliable components could store and process digital information.

Finally, von Neumann’s ideas pointed toward machines sending and receiving information to and from other computers. The modern computer network incorporating many diverse computing elements is one outcome.

In the 45 years since ENIAC, computers have become bigger, faster, and more versatile. They pervade all aspects of business, government, communication, education, and science.

The last 25 years have seen an equally rapid development of the smallest computers, the so-called microcomputers.

It started in 1969 when Marcian E. Hoff of the Intel Corporation was working with a group of Japanese engineers designing the logic for a family of calculators.

To avoid the complexity of their approach, he proposed “a general purpose computer programmed to be a calculator.”

Hoff and Stanley Mazor then developed the architecture of an IC computer.

In 1973, Intel filed for a patent on Hoff’s device, and the patent was issued in 1974. So-called “single-chip microcomputers” are completely contained in one integrated circuit. In general, however, a microcomputer consists of a microprocessor as the control unit and several other IC chips providing memory and data-handling functions.

Those first microprocessors were too feeble to power anything resembling a personal computer, but they were, and still are, powerful enough to drive pocket calculators and control machines performing simple tasks. They have become, like the motor, a tool for every use. Millions are sold each year to operate home burglar alarms, remote television controllers, programmers for videocassette recorders, and dozens of toys.
