Video game hardware. Silicon Graphics (SGI) machines are based on the Geometry Engine, developed by Jim Clark, I think probably when he was still at Stanford in the early '80s.
At a certain point, everybody figures out there's nothing new in line drawing algorithms.
Typically the flow goes like this:
There is a seminal paper, which sets the tone for some topic.
That first ray-tracing paper.
That first inverse kinematics paper.
That first automated behavior paper.
Corporate and university research is applied to it.
This becomes a specialty.
And the Ph.D. and Master's students start marching through SIGGRAPH and other conferences with papers on this topic.
Cornell, for instance, has been focusing on radiosity and other types of rendering for a long time.
Then the technique gets incorporated into off-the-shelf software, once it can be implemented efficiently. And from there, it's frequently implemented in hardware, where you can capitalize on performance. It's only implemented in hardware once you know some new breakthrough in line-drawing algorithms is not going to happen.
The information -- the technology transfer -- is the students who come out of these schools. Software to hardware, or hardware to software. Alan Turing, in a 1937 paper, basically invented the concept of software: the idea that you could represent anything with numbers, including instructions, which could be used to change the operation of the machine. Which is why we don't have to have a different machine to switch between word processing and spreadsheets and graphics.
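The stored-program idea can be sketched as a toy machine whose instructions are just numbers sitting in the same memory as its data. This is a hypothetical instruction set made up for illustration, not any real machine:

```python
# A toy stored-program machine: instructions and data share one memory,
# so changing a few numbers changes what the machine does.
# Hypothetical opcodes: 1 = add, 2 = multiply, 0 = halt.
# Each instruction occupies four cells: [opcode, src_a, src_b, dest].

def run(memory):
    pc = 0  # program counter
    while memory[pc] != 0:
        op, a, b, dest = memory[pc:pc + 4]
        if op == 1:
            memory[dest] = memory[a] + memory[b]
        elif op == 2:
            memory[dest] = memory[a] * memory[b]
        pc += 4
    return memory

# Program lives in cells 0-8, data in cells 9-11.
mem = [1, 9, 10, 11,    # add cell 9 and cell 10 into cell 11
       0, 0, 0, 0, 0,   # halt (plus padding)
       3, 4, 0]         # data: 3, 4, and a slot for the result
run(mem)
print(mem[11])  # 3 + 4 = 7

# Rewriting one number in memory turns the adder into a multiplier --
# no rewiring required, which is Turing's point:
mem = [2, 9, 10, 11, 0, 0, 0, 0, 0, 3, 4, 0]
run(mem)
print(mem[11])  # 3 * 4 = 12
```

The same physical machine does different jobs because its behavior is just data, which is the property that lets one computer be a word processor, a spreadsheet, or a graphics workstation.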
ENIAC, the first digital computer, was operated primarily by throwing switches.
Previous to that, you would have to actually modify the hardware in order to get it doing something different. General-purpose computers were used in the early phases; all the interesting stuff was happening in software.
Then people started using custom hardware design to do interesting things.
The Amiga. Digital video and audio systems. All this kind of stuff. The real-time codec (compression/decompression) boards that we have now.
For instance, if you had a Mac 840AV, you were doing digital audio and digital video in hardware. Okay? Today if you buy, say, a Power Mac 8500, that stuff is being done in software, because the CPU is faster and has time to do it at the same speed as the former hardware did.
What does the future bring?
Probably some kind of combination.
Probably everything will be emulated.
But there's still a role for custom hardware.
The near future, from what I can tell, looks like this:
We started out with hardware. We went to software.
And in terms of where the real juice is, we're going toward the document. Things like
Adobe Acrobat. There's a lot of research being done on creative documents
where the document holds audio, video, graphics, maybe databases -- all kinds of things.
So it's centering on the document. And then you'll use a variety of editors to edit the information in the document.