Surround computing is apparently an “extension of pervasive and ambient computing trends”, no less. It is when computing technologies become “a completely natural and seamless part of daily life”. Papermaster is living proof of it: we can't see any seams on him at all, but that might be the idea.
Speaking at the Hot Chips conference, Papermaster said that surround computing imagines a world without keyboards or mice, where natural user interfaces based on voice and facial recognition redefine the PC experience, and where the cloud and clients collaborate to synthesise exabytes of image and natural language data. He said the ultimate goal is to create devices that deliver intelligent, relevant, contextual insight that improves consumers’ everyday lives in real time through a variety of futuristic applications.
Papermaster claimed that AMD is leading the quest for devices that understand and anticipate users’ needs, are driven by natural user interfaces, and disappear seamlessly into the background. In other words, if you want to know where Papermaster's seams are, you should look in the background. He said that the glorious new era will rely on robust “plug-and-play” IP portfolios including central processing units (CPUs), graphics processing units (GPUs), fixed function logic, and interconnect fabric.
He also unveiled key details of AMD’s upcoming “Steamroller” CPU architecture while underscoring the benefits of the industry-standard Heterogeneous System Architecture (HSA), which lets software developers easily assign scalar and parallel compute workloads to the most appropriate compute units and thereby optimise power efficiency.
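HSA itself is a hardware and runtime specification, not something you call from a scripting language, but the scheduling idea Papermaster described can be sketched loosely. The toy dispatcher below (all names are our own invention, not an HSA API) routes serial, branchy work to a single "latency-optimised" path and uniform work over many independent chunks to a "throughput-optimised" pool, standing in for a CPU and GPU respectively.

```python
# Illustrative sketch only: this is a loose analogy for HSA-style workload
# routing, not real HSA code. Function names are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def run_scalar(task, data):
    # Branchy, serial work: suits a latency-optimised core (the "CPU" here).
    return task(data)

def run_parallel(task, chunks, workers=4):
    # Uniform work over independent chunks: suits a throughput-optimised
    # unit (a GPU under real HSA; a thread pool in this sketch).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, chunks))

def dispatch(task, data):
    # Crude heuristic standing in for the developer-directed scheduling
    # HSA enables: parallel-friendly input goes to the parallel unit.
    if isinstance(data, list) and len(data) > 1:
        return run_parallel(task, data)
    return run_scalar(task, data)
```

For example, `dispatch(lambda x: x * x, [1, 2, 3])` fans the work out across the pool, while `dispatch(lambda x: x * x, 5)` runs it on the scalar path.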
He said that the road that leads AMD to the Surround Computing Era will be no less challenging and every bit as exciting as the 20-year journey in graphics processing that brought gamers from ‘Pong’ to today’s modern game titles that feature stunning visual realism.
“It will take an industry movement to complete this journey, and HSA provides the clear path forward to enable this next generation in computing,” he said.