From 8-bit micros to modelling the brain
Sunday, July 30, 2023 at 8:01PM
Roy Rubenstein in Acorn Computers, Amulet, Arm, Asynchronous logic, BBC Micro, Loihi neuromorphic processor, Professor Steve Furber, University of Manchester, neural network

Part 1: An interview with computer scientist, Professor Steve Furber

Steve Furber is renowned for architecting the 32-bit reduced instruction set computer (RISC) processor from Acorn Computers, which became the founding architecture for Arm.

Arm processors have played a recurring role in Furber's career. He and his team developed a clockless - asynchronous - version of the Arm, while a specialist Arm design has been the central building block of a project to develop a massively parallel neural network computer.

Professor Steve Furber

Origins

I arrive at St Pancras International station early enough to have a coffee in the redeveloped St Pancras Renaissance London Hotel, the architecturally striking building dating back to the 19th century that is part of the station.

The train arrives on time at East Midlands Parkway, close to Nottingham, where Professor Steve Furber greets me and takes me to his home.

He apologises for the boxes, having recently moved to be closer to family.

We settle in the living room, and I'm served a welcome cup of tea. I tell Professor Furber that it has been 13 years since I last interviewed him.

 

Arm architecture

Furber was a key designer at Acorn Computers, which developed the BBC Microcomputer, an early personal computer that spawned a generation of programmers.

The BBC Micro used a commercially available 8-bit microprocessor, but in 1983-84 Acorn's team, led by Furber, developed a 32-bit RISC architecture.

The decision was bold and had far-reaching consequences: the Acorn RISC Machine, or ARM1, would become the founding processor architecture of Arm.

Nearly 40 years on, firms have shipped over 250 billion Arm-based chips.

 

Cambridge

Furber's interest in electronics began with his love of radio-controlled aircraft.

He wasn't very good at flying them, and his physics master at Manchester Grammar School helped him.

Furber always took a bag with him when flying his model plane, as he often returned with his plane in pieces; he refers to his aircraft as 'radio-affected' rather than radio-controlled.

Furber was gifted at maths and went to Cambridge, where he undertook the undergraduate Mathematical Tripos, followed by the Maths Part III. Decades later, the University of Cambridge recognised Maths Part III as equivalent to a Master's.

"It [choosing to read maths] was very much an exploration," says Furber. "My career decisions have all been opportunistic rather than long-term planned."

At Cambridge, he was influenced by the lectures of British mathematician James Lighthill on biofluid dynamics. This led to Furber's PhD topic, looking at the flight of different animals and insects to see if novel flight motions could benefit jet-engine design.

He continued his love of flight as a student by joining a glider club, but his experience was mixed. When he heard of a fledgling student society building computers, he wondered if he might enjoy using computers for simulated flight rather than actual flying.

"I was one of the first [students] that started building computers," says Furber. "And those computers then started getting used in my research."

The first microprocessor he used was the 8-bit Signetics 2650.

 

Acorn Computers

Furber's involvement in the Cambridge computer society brought him to the attention of Chris Curry and Hermann Hauser, co-founders of Acorn Computers, a pioneering UK desktop computer company.

Hauser interviewed and recruited Furber in 1977. Furber joined Acorn full-time four years later after completing his research fellowship.

Hauser, who co-founded venture capital firm Amadeus Capital Partners, said Furber was among the smartest people he had met. And having worked in Cambridge, Hauser said he had met a few.

During design meetings, Furber would come out with outstandingly brilliant solutions to complex problems, said Hauser, who led the R&D department at Acorn Computers.

 

BBC Micro and the ARM1

The BBC's charter includes educating the public, and the broadcaster planned programmes highlighting microprocessors.

It wanted to explain in detail what microprocessors could do and was looking for a computer to give viewers a hands-on perspective alongside the TV programmes.

When the BBC spoke to Acorn, it estimated it would need 12,000 machines. Public demand was such that 1.5 million were eventually sold.

The computer's success led Acorn to consider its next step. It had already added a second processor to the BBC Micro and expanded its portfolio with machines such as the Acorn Cambridge Workstation. But by then, microprocessors were moving from 8 bits to 16.

Acorn's R&D group lab-tested leading 16-bit processors but favoured none.

One issue was that the processors could not be interrupted for relatively long periods - when writing to disk storage, for example - yet the BBC Micro used processor interrupts heavily.

A second factor was that memory chips accounted for much of the computer's cost.

"The computer's performance was defined by how much memory bandwidth the processor could access, and those 16-bit processors couldn't use the available bandwidth; they were slower than the memory," says Furber. "And that struck us as just wrong."

While Furber and colleagues were undecided about how to proceed, they began reading academic papers on RISC processors, CPUs designed on principles different to mainstream 16-bit processors.

"To us, designing the processor was a bit of a black art," says Furber. "So the idea that there was this different approach, which made the job much simpler, resonated with us."

Hauser was very keen to do ambitious things, says Furber, so when Acorn colleague Sophie Wilson started discussing designing a RISC instruction set, they started work, but solely as an exploration.

Furber would turn Wilson's instruction set and architecture designs into a microarchitecture.

"It was sketching an architecture on a piece of paper, going through the instructions that Sophie had specified, and colouring it in for what would happen in each phase," says Furber.

The design was scrapped and started again each time something didn't work or needed a change.

"In this sort of way, that is how the ARM architecture emerged," says Furber.

It took 18 months for the first RISC silicon to arrive and another two years to get the remaining three chips that made up Acorn's Archimedes computer.

The RISC chip worked well, but by then, the IBM PC had emerged as the business computer of choice, confining Acorn to the educational market. This limited Acorn's growth, making it difficult for the company to keep up technologically.

Furber was looking at business plans to move the Arm activity into a separate company.

"None of the numbers worked," he says. "If it were going to be a royalty business, you'd have to sell millions of them, and nobody could imagine selling such numbers."

During this time, a colleague told him how the University of Manchester was looking for an engineering professor. Furber applied and got the position.

Arm was spun out in November 1990, but Furber had become an academic by then.

St Pancras International station

Asynchronous logic

Unlike most UK computing departments, Manchester's originated in building machines. Freddie Williams and Tom Kilburn built the first stored-program computer in 1948, and Kilburn went on to set up the department.

"The department grew out of engineering; most computing departments grew out of Maths," says Furber.

Furber picked asynchronous chip design as his first topic for research, motivated by a desire to improve energy efficiency. "It was mainly exploring a different way to design chips and seeing where it went," says Furber.

Asynchronous or self-timed circuits use energy only when there's something useful to do. In contrast, clocked circuits burn energy all the time unless they turn their clocks off - clock gating - a technique that is now increasingly used.
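
The contrast can be sketched in software. The toy Python below is illustrative only - it is not the Amulet design - and models an asynchronous stage as one that blocks until a request event arrives, against a clocked stage whose activity is tied to the tick rate regardless of workload.

    import queue

    def async_stage(requests):
        # Request/acknowledge flavour: block until data arrives, do the
        # work, then go idle again. Activity tracks useful work.
        activity = 0
        while True:
            data = requests.get()   # no switching until a request arrives
            if data is None:        # sentinel marks end of input
                return activity
            activity += 1           # one unit of 'energy' per useful datum

    def clocked_stage(ticks):
        # Clocked flavour: the clock tree and latches toggle every cycle,
        # so activity tracks the clock, not the workload.
        return sum(1 for _ in range(ticks))

    q = queue.Queue()
    for item in (1, 2, 3, None):
        q.put(item)

    print("async activity:  ", async_stage(q))       # 3: one per datum
    print("clocked activity:", clocked_stage(1000))  # 1000: one per tick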

Asynchronous chips also have significant advantages in terms of electromagnetic interference.

"What a clock does on the chip is almost as bad as you can get when it comes to generating electrical interference, locking everything to a particular frequency, and synchronising all the current pulses is exactly that," says Furber.

The result was the Amulet processor series, asynchronous versions of the Arm, which kept Furber and his team occupied during the 1990s and the early 2000s.

In the late 1990s, Arm moved from supplying hard macro processor cores to synthesisable ones. The issue was that electronic design automation (EDA) tools did not synthesise asynchronous designs well.

While Furber and his team learnt how to build chips - the Amulet3 processor was a complete asynchronous system-on-chip - the problem shifted to automating the design process. Even now, asynchronous design EDA tools are lagging, he says.

In the early 2000s, Furber's interest turned to neuromorphic computing.

The resulting SpiNNaker chip, the programmable building block of Furber's massively parallel neural network, uses asynchronous techniques, as does Intel's Loihi neuromorphic processor.

"There's always been a synergy between neuromorphic and asynchronous," says Furber.

Implementing a massive neural network using specialised hardware has been Furber's main interest for the last 20 years, and it is the subject of the second part of this interview.

For Part 2: Modelling the Human Brain with specialised CPUs, click here

 

Further Information:

The Everything Blueprint: The Microchip Design That Changed The World, by James Ashton. 
