Dennis Ritchie 1941-2011

Unix creator dies: Computer industry loses another leading light

by Steven Mostyn – Oct 14 2011, 09:55

October has not been a good month for the world of technology where untimely departures are concerned. Following the death of Apple chief executive Steve Jobs, influential computer scientist Dennis Ritchie has also lost his battle with a prolonged illness.

 

Ritchie, who was born in Bronxville, New York, in 1941, was best known for his contributions to the creation of both the Unix operating system and the landmark C programming language.

 

Ritchie's legacy is considerable: the C programming language, created almost 40 years ago, remains in widespread use today and underpins much of the software that powers modern websites.

 

Ritchie's C also inspired the later creation of C++ and Java, both of which are hugely popular languages today.
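
To give a sense of the language for readers who have never seen it, below is a minimal C program in the spirit of the famous "hello, world" example popularized by Kernighan and Ritchie's book The C Programming Language; it is an illustrative sketch rather than a quotation from the book.

#include <stdio.h>

/* A minimal C program: print a greeting and exit successfully. */
int main(void)
{
    printf("hello, world\n");
    return 0;
}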

 

In 1983, Dennis Ritchie and his development partner Kenneth Thompson were awarded the coveted Turing Award for their work on the Unix platform.

 

His death was announced on Thursday of this week by Alcatel-Lucent's Bell Labs, where he had been a mainstay contributor from the late 1960s until his retirement in 2007.

 

“Dennis was well loved by his colleagues at Alcatel-Lucent Bell Labs, and will be greatly missed,” said company president Jeong Kim in an official statement. “He was truly an inspiration to all of us, not just for his many accomplishments, but because of who he was as a friend, an inventor, and a humble and gracious man.”

John McCarthy 1927-2011

Artificial intelligence community mourns John McCarthy

BBC – 25 October 2011 Last updated at 13:45 ET

Artificial intelligence researcher John McCarthy has died. He was 84.

 

The American scientist invented the computer language LISP.

 

It went on to become the programming language of choice for the AI community, and is still used today.

 

Professor McCarthy is also credited with coining the term “Artificial Intelligence” in 1955 when he detailed plans for the first Dartmouth conference. The brainstorming sessions helped focus early AI research.

 

Prof McCarthy’s proposal for the event put forward the idea that “every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it”.

 

The conference, which took place in the summer of 1956, brought together experts in language, sensory input, learning machines and other fields to discuss the potential of information technology.

 

Other AI experts describe it as a critical moment.

 

“John McCarthy was foundational in the creation of the discipline Artificial Intelligence,” said Noel Sharkey, Professor of Artificial Intelligence at the University of Sheffield.

 

“His contribution in naming the subject and organising the Dartmouth conference still resonates today.”

 

LISP

Prof McCarthy devised LISP at the Massachusetts Institute of Technology (MIT) and detailed it in an influential paper published in 1960.

 

“The invention of LISP was a landmark in AI, enabling AI programs to be easily read for the first time,” said Prof David Bree, from the Turin-based Institute for Scientific Interchange.

 

“It remained the AI language, especially in North America, for many years and had no major competitor until Edinburgh developed Prolog.”

 

Regrets

In 1971 Prof McCarthy was awarded the Turing Award by the Association for Computing Machinery in recognition of his importance to the field.

 

He later admitted that the lecture he gave to mark the occasion was “over-ambitious”, and he was unhappy with the way he had set out his new ideas about how commonsense knowledge could be coded into computer programs.

 

However, he revisited the topic in later lectures and went on to win the National Medal of Science in 1991.

 

After retiring in 2000, Prof McCarthy remained Professor Emeritus of Computer Science at Stanford University, and maintained a website where he gathered his ideas about the future of robots, the sustainability of human progress and some of his science fiction writing.

 

“John McCarthy’s main contribution to AI was his founding of the field of knowledge representation and reasoning, which was the main focus of his research over the last 50 years,” said Prof Sharkey.

 

“He believed that this was the best approach to developing intelligent machines and was disappointed by the way the field seemed to have turned into high speed search on very large databases.”

 

Prof Sharkey added that Prof McCarthy wished he had called the discipline Computational Intelligence, rather than AI. However, he said he recognised his choice had probably attracted more people to the subject.

Jacob Goldman 1921-2011

Founder Of The Company That Inspired Steve Jobs Dies At Age 90

By Alex Heath (2:36 pm, Dec 22)

Many have called Steve Jobs the father of modern computing, but some would argue that the true credit goes to Jacob Goldman, founder of Xerox PARC. Under Goldman's guidance, Xerox became responsible for the technology that inspired Steve Jobs to create computers like the Lisa.

 

The New York Times is reporting that Jacob Goldman passed away this week at the age of 90. He was Xerox’s chief scientist and founder of the Xerox Corporation’s Palo Alto Research Center — the very place Jobs took his team in December of 1979 to get a demonstration of the technology that drove him to create the first successful personal computer.

 

The research lab Goldman founded in the 1970s was responsible for many technological breakthroughs that have influenced Apple and its competition, including the graphical user interface, laser printing, and the Ethernet office network.

 

While Xerox PARC pioneered the age of modern computing in many ways, it failed to implement its inventions successfully. In Walter Isaacson's biography of Steve Jobs, the Apple co-founder was quoted as saying that "Xerox could have owned the entire computer industry." The company failed to execute its ideas, and Jobs told Xerox employees that they were "sitting on a gold mine" before he took what he had seen at Xerox PARC, built upon it, and created a computer company that is now valued at over $370 billion.

 

Apple would not be what it is today were it not for Jacob Goldman.

Jack Tramiel 1928-2012

The Anti-Steve Jobs Dies: So Long

By Harry McCracken | @harrymccracken | April 9, 2012

Jack Tramiel, the antithesis of Steve Jobs, has died. Tramiel was the founder of Commodore and, unlike Jobs, believed that computers should be utilitarian and cheap, with little regard for elegant design or attention to detail; the legendary Commodore 64 was the embodiment of that philosophy.

 

While Jobs' sense of aesthetics and obsessive attention to detail permeated everything Apple did, from hardware to software, Tramiel (born Jacek Trzmiel in Lodz, Poland, in 1928) didn't give a damn. His only concerns were price and making things useful enough to win the battle in the marketplace.

 

As a result, Commodore's design was the crude club to Apple's elegant sword. And while time and nostalgia have made his computers charming, they are still slabs of ugly plastic. Charming ugly plastic slabs that I still like; I used the C64 all through my middle school years and remember loving every bit of its craptastic, no-frills nature.

 

Tramiel's company began as a typewriter repair shop, then moved on to calculators and LED watches and, finally, computers: first the PET 2001, made in 1977 to look like a 1990 point-of-sale register or a sci-fi B-movie computer, and then the Commodore VIC-20.

 

But it was the $595 Commodore 64 that won the battle for him, and the computer became incredibly popular. Ironically, Tramiel was forced to step down from the company he created soon after the C64, just as Jobs was after the Macintosh. This led Tramiel to buy Atari; in a double irony, Atari had been Jobs' first employer.

 

At Atari, he came out with the Atari ST, which competed against the Commodore Amiga (created after he left his old company), the Apple Macintosh and the IBM PC. Those who knew him say he was a very nice man. Rest in peace, Jack Tramiel. I'll play Ghostbusters on my C64 emulator today to honor your achievements.

Douglas Engelbart 1925-2013

Computer mouse inventor Douglas Engelbart dies

By CNN Staff, updated 2:34 PM EDT, Sun July 7, 2013

Douglas Engelbart, whose invention of the mouse transformed the way people interact with computers, has died.

 

Engelbart died Tuesday night at his home in Atherton, California, according to a statement from SRI International, the research institute where he once worked. He was 88.

 

"Doug's legacy is immense — anyone in the world who uses a mouse or enjoys the productive benefits of a personal computer is indebted to him," Curtis R. Carlson, SRI's president and CEO, said in a written statement.

 

Decades ago, Engelbart came up with the idea for what we now know as the mouse.

 

His first prototype, which featured a carved out wooden block, wheels and a tiny red button, looks quite different from the sleek plastic designs now seen in homes and offices around the world.

 

A radar technician during World War II, Engelbart worked at the Stanford Research Institute during the 1960s. It was there that a vision of people sitting in front of a video screen, interacting with a computer, came to him.

 

"I knew enough engineering and had enough experience as a radar person to know that if a computer can punch cards or print paper, it can draw anything you want on a screen," he told CNN in 1997 after receiving a $500,000 prize for American innovation.

 

Engelbart invented and patented what he called the "x-y position indicator," receiving a $10,000 check for the invention. He told CNN he couldn't recall who on his team had decided to call it a mouse.

 

At the time, it wasn't easy to convince fellow scientists to follow his vision, Engelbart said. But he persisted.

 

Later, he went on to found the Doug Engelbart Institute, a nonprofit dedicated to boosting the collective ability to solve complex, urgent problems on a global scale.

 

"Sometimes I reflect on how naive somebody has to be in order to get visions -- and plug away at them -- that ultimately proceed, and how many other people with visions that are as naive just fall off the cliff," Engelbart told CNN in 1997.

 

In addition to the computer mouse, Engelbart's work at SRI from 1957 to 1977 helped develop tech innovations such as display editing, online processing, linking and in-file object addressing, use of multiple windows, hypermedia, and context-sensitive help, the institute said.

 

"Doug was a giant who made the world a much better place and who deeply touched those of us who knew him," Carlson said. "SRI was very privileged and honored to have him as one of our 'family.' He brought tremendous value to society. We will miss his genius, warmth and charm."

 

Engelbart is survived by his wife and four children.
