In 1993, Nintendo was a company in an interesting position. While it was undoubtedly a leader in the video game console market, it could no longer boast the virtual monopoly it held during the late Eighties. What’s more, the industry was already planning to transition away from the 16-bit console market, and rival manufacturers were beginning to show their hands. NEC had experienced success in Japan with the PC Engine and had already shown off the 32-bit Tetsujin, while Atari had announced the Jaguar in August 1993 and was gearing up for a holiday test launch. The much-vaunted 3DO, from former Electronic Arts executive Trip Hawkins, was also scheduled to launch for the holiday season and had the backing of electronics giant Panasonic.
Nintendo wasn’t particularly concerned with most of these companies – at the time, Sega was its biggest rival, having been the first company to bring serious competition to the console market. As the industry’s two biggest players, either of them could have ended up behind what ultimately became the Nintendo 64. The hardware was primarily engineered by Silicon Graphics, Inc., a huge name in movie special effects technology which had recently bought MIPS Technologies, the designer of the CPUs used in its workstations.
Having developed a low-cost, power-efficient version of the latest MIPS processors, SGI put together a design proposal for a games console. By September 1993, the rivals had signed their contracts and made their announcements – Nintendo would partner with SGI and launch its 64-bit home console in late 1995, while Sega would use Hitachi’s 32-bit processors and launch in the autumn of 1994. Sony, Nintendo’s former partner on the SNES CD-ROM project, announced its intention to launch a home console of its own the following month.
Being the last to market wasn’t an unfamiliar situation for Nintendo, as it had done the same with the SNES and been able to retain a substantial market share regardless. The tactic here was the same – simply put, Nintendo bet on having the best technology. Project Reality, as it soon became known, was also an easy machine to hype. With SGI on board, Nintendo Magazine System claimed that the machine had “the potential to provide graphic images such as those seen in Abyss, Jurassic Park and Terminator 2”. At a time when more bits meant better, being a 64-bit machine was a big deal. Total emphasised that “[Sega’s] next generation machine, Saturn, is a 32-bit console – fairly powerful, but nowhere near as fast as the Silicon Graphics hardware”.
By the time the console had received its Ultra 64 name in 1994, Nintendo had decided on an extensive advance marketing strategy, working with Midway to create Ultra 64 branded arcade games and taking out advertising to encourage players to wait for the console. They needed plenty of patience, as the Nintendo 64 was delayed repeatedly prior to its Japanese release in June 1996. “It’s hard to do hardware full stop, and this was a totally new platform – new chipset, new CPU, new GPU. On top of that, we were trying to make a flagship Mario game,” says Giles Goddard, a programmer working for Nintendo at that time. “They just wanted to get it right – there was no particular big problem that happened that caused a delay or anything.”
While working on the planned launch game Star Wars: Shadows Of The Empire, Eric Johnston had a privileged position in seeing the system take shape. “I loved the N64 hardware. Mark Blattel and I had a desk at SGI during its development, running it through its paces as it progressed. At the time, the only machine we could simulate it on was a $250,000 SGI Onyx, which was a purple and black box the size of a small desk, which required its own 16-amp power outlet,” Johnston tells us.
Goddard also remembers this setup: “There were changes all the time basically, we rarely saw actual hardware. There were two levels of emulation – the API side emulation where you could recompile your game to run on SGI hardware natively, with very little changes to your code you could run either the native one or build it for the emulator. Most of the time we were developing on the native version of the game, and then occasionally we’d recompile it for the Onyx to see if it still worked in the same way. We rarely saw actual N64 devices.”
The CPU was quite powerful for its day, with a high clock speed of 93.75MHz for a performance of 125 million instructions per second – for comparison, the PlayStation does around 30 MIPS. But did the ability to use 64-bit processing actually provide any practical advantages? “Almost none, I would say,” says Goddard. “I’d say it was more of a marketing thing than anything actually usable. A float is 32 bits and a double float is 64 bits, and you don’t need double floats to do any kind of 3D maths usually, especially back then. All games ran in 32-bit mode. 32 bits is what, 4GB of memory? This thing only had 4MB,” he explains.
“From memory, I think the 64 bits was more marketing spiel than anything else,” concurs Wetrix and Mario Artist: Paint Studio programmer Amir Latif. “It certainly didn’t have a huge amount of RAM to access and the memory bus certainly wasn’t that wide. There was a 32-bit mode and a 64-bit mode, but in reality, we never really touched the 64-bit mode as there were other knock-on effects (for example, pointers become eight bytes instead of four).”
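Goddard’s and Latif’s numbers are easy to sanity-check. The following sketch (illustrative only – none of this is N64 code) shows why 32-bit mode was enough: a 32-bit address space covers 4GB against the console’s 4MB of RAM, and single-precision floats keep roughly seven significant digits, ample for console-scale 3D coordinates.

```python
import struct

def to_f32(x):
    # Round-trip a value through a 32-bit IEEE-754 float,
    # the precision N64 games actually worked in
    return struct.unpack('<f', struct.pack('<f', x))[0]

# 32-bit addressing already covers vastly more RAM than the N64 had
addressable = 2 ** 32          # 4 GiB of address space
n64_ram = 4 * 1024 * 1024      # 4 MiB in the stock console

# Single precision holds ~7 significant digits -- for a world
# coordinate like this, the rounding error is a fraction of a unit
pos = 1234.5678
err = abs(to_f32(pos) - pos)
```

Doubling pointer sizes to eight bytes, as Latif notes, would only have eaten into that 4MB for no practical gain.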
“There were lots of new things that were being thrown at us that we had to familiarise ourselves with,” says Banjo-Kazooie and Banjo-Tooie programmer Chris Sutherland. “We were previously used to coding things in assembly language, so depending on the processor we were using, whether it was for the Game Boy, the NES, or the SNES, we’d be familiarising ourselves with that processor. So I suppose it was a bit of a leap in terms of moving from assembly language to C, where we were programming on a higher-level language. There were lots of things to consider there, and lots of new things to learn,” he explains.
“There was a move to three dimensions as well, which is something that we weren’t familiar with, learning things with cameras and things like that,” Sutherland continues. “We were also using different machines, so previously we would be using PCs to develop with, and now we were using these Silicon Graphics Indys which didn’t run Windows, but they ran a version of a Unix-style operating system.”
A new way
One unique thing about the N64 was the Reality Co-Processor, or the RCP. Although this chip handled the console’s graphical functions, that wasn’t its only task – it was also used for audio and input/output operations. The RCP could be reconfigured towards different performance profiles using custom microcode, and it had plenty of hardware features that were key to the N64’s distinct look. “In particular, I liked having built-in Z-buffer, trilinear mipmapping and floating point. Today, few 3D game developers would even consider going without these, but at the time they were new, and didn’t exist on other platforms at all, even expensive home PCs,” says Johnston. “If you run N64 games side by side with other contemporary platforms, you definitely notice the difference visually.”
Latif also remembers this visual difference: “In terms of pixel fidelity, the N64 had fairly cutting-edge features, especially compared to its PlayStation peer. Z-buffering, antialiasing, bilinear interpolation texturing, perspective corrected texturing, mipmap texturing, environment mapping, fog, all of these features were missing from its competitors. Unfortunately, they also came at a heavy price and the N64 really struggled to throw around too many triangles, especially with some of those heavier effects turned on.”
Rare struggled with the performance balance in its early work with the machine. “There were all kinds of convoluted systems we tried so that we didn’t have to use that Z-buffer – sorting by object and all this kind of stuff, which always works to 80% but then there’s the 20% where things draw in the wrong order. There were all kinds of things we tried to mitigate that, but in the end we settled on ‘let’s use the Z-buffer’,” Banjo-Kazooie and Banjo-Tooie artist Ed Bryan recalls. “The Z-buffer was seen as extremely expensive in terms of framerate, but you couldn’t do without it, as we found out,” adds artist Steve Mayles. “But this wasn’t on Banjo, this was when we were doing the Dream game,” he says, referring to the precursor to Banjo-Kazooie. “You’d be walking across a bridge and it would all look great, then you’d move the camera slightly and this massive thing would pop in front of everything.”
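The “sorting by object” scheme Bryan describes is the classic painter’s algorithm, and a minimal sketch shows both the idea and its failure mode. The scene layout and names here are purely illustrative, not from any Rare code.

```python
# Painter's algorithm sketch: draw objects back-to-front by distance
# from the camera, so nearer objects overdraw farther ones -- the
# per-object sort Rare used before committing to the Z-buffer.

def camera_distance(obj, camera):
    dx = obj["x"] - camera["x"]
    dy = obj["y"] - camera["y"]
    dz = obj["z"] - camera["z"]
    return (dx * dx + dy * dy + dz * dz) ** 0.5

def painters_draw_order(objects, camera):
    # Farthest first; later draws paint over earlier ones
    return sorted(objects, key=lambda o: camera_distance(o, camera),
                  reverse=True)

camera = {"x": 0.0, "y": 0.0, "z": 0.0}
scene = [
    {"name": "bridge", "x": 0.0, "y": 0.0, "z": 10.0},
    {"name": "bear",   "x": 0.0, "y": 0.0, "z": 5.0},
]
order = [o["name"] for o in painters_draw_order(scene, camera)]
```

Because each object gets a single depth, anything that both occludes and is occluded by a neighbour (a long bridge crossing the view axis, say) sorts to the wrong side as the camera moves – exactly the pop-in Mayles remembers. A Z-buffer resolves depth per pixel instead, at a cost in fill rate.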
Despite being industry leaders in 3D art thanks to games like Donkey Kong Country and Killer Instinct, Rare’s artists also found that they needed to learn a new way of working for the new console. “With NURBS, the way we did the 3D for Donkey Kong Country, it was completely different to polygons so it was really another world of 3D with the triangles and the vertices,” says Mayles. “There were more rules to follow really, because it was real-time. With the NURBS, you were making it and rendering it out, and at that point it didn’t really matter what it looked like in the package, but with the polygons everything had to be done exactly right, or else it’d go into the game and it’d all go wrong.”
However, the NURBS experience didn’t go to waste, as it was used to create textures – though they were their own challenge. “Colouring everything a vertex at a time, texturing everything a triangle at a time – it was a very different world to what it is now. We opted for as few textures on the characters as possible,” Bryan remembers. “Which actually works well in retrospect,” Sutherland adds, “because if you have that shading, then if you have a modern version which upscales it, it looks kind of neat still, whereas if you have a texture it just gets really blurred out.”
The system’s approach to memory followed a similarly flexible model to the co-processor. Previous consoles had allocated various pools of RAM to different tasks – main memory, video, and audio. The N64 used a unified memory architecture, allowing developers to distribute the system’s 4MB memory between tasks as they saw fit. “Up until that point everybody had to deal with banks and DMA memory between banks, and that was a real pain to do that kind of stuff. Now we had basically everything under one roof, which was just fantastic,” says Goddard.
“You basically had three areas. You had the ROM, you had the RAM, and then you had the graphics memory – and when I say memory I mean texture memory and vertex memory,” Goddard continues. “So you still had a graphics part of the memory that was separate – it was on-chip cache, but it was great to have everything in RAM – you could access anything, anywhere, without having to worry about what area it was. That was one of the big attractions for that kind of architecture.”
Although the N64’s memory architecture was not especially fast, Goddard doesn’t recall this being an issue. “It was more the size of the caches that was the problem, they were quite small. It was 4K for the textures and I think something really stupid like 16 vertices. That’s where triangle stripping and all these sort of clever ways of getting the most amount of triangles out of fewer vertices was really important.” Another unusual aspect of the RAM was the ninth bit reserved for graphical functions – something Johnston was keen to exploit in other ways.
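The arithmetic behind Goddard’s point about triangle stripping is simple to sketch (this is a toy illustration, not N64 code): each new triangle in a strip reuses the previous two vertices, so a strip of N triangles costs N + 2 vertex loads instead of 3 × N for independent triangles.

```python
# Vertex-cost arithmetic for triangle strips vs independent triangles.

def strip_cost(num_triangles):
    # Each triangle after the first adds only one new vertex
    return num_triangles + 2

def independent_cost(num_triangles):
    # Three vertex loads per triangle with no sharing
    return 3 * num_triangles

# With a 16-entry vertex cache, a 14-triangle ribbon fits entirely
# after a single load of each vertex
ribbon = strip_cost(14)
naive = independent_cost(14)
```

With a cache as small as the one Goddard recalls, tripling the number of triangles squeezed out of each cached vertex was the difference between feeding the RCP efficiently and starving it.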
“You might know that the original in-development architecture was only 2 or 2.5MB RAM, all 9-bit DRAM. The CPU only had access to it as eight bits per byte, so I wrote a sketchy driver to (at some hiccup cost) use the ninth bit as extra memory. I mean hey, that’s an extra 280K or so, minus what the frame buffers need – enough for some cached textures or sounds,” Johnston explains. “I proudly showed it off to Acorn, a super-cool ace developer at Nintendo. Some time later, after they’d upped the memory to 4MB I got an email saying you’re welcome and please don’t use the ninth-bit hack job in a shipping game.” Though Johnston’s hack never saw the light of day, it was possibly for the best. “The RCP made really nice use of the ninth bit though, for extra Z-buffer resolution and 5553 RGB+ coverage for their clever and imperfect anti-aliasing, in a world where supersampling wasn’t an option,” he tells us.
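Johnston’s “extra 280K or so” checks out as back-of-the-envelope arithmetic: one spare bit per 9-bit byte is an eighth of the RAM again, packed eight spare bits to a usable byte.

```python
# Rough arithmetic behind the ninth-bit hack: one extra bit per byte
# of 9-bit DRAM, packed eight-to-a-byte, yields 1/8 of RAM as bonus
# storage -- before subtracting whatever the frame buffers consume.

def ninth_bit_bytes(ram_bytes):
    return ram_bytes // 8  # one bit per byte -> 1/8 of capacity

extra = ninth_bit_bytes(int(2.5 * 1024 * 1024))  # the 2.5MB dev config
```

That comes to 320K gross on a 2.5MB machine – the same ballpark as Johnston’s figure once the frame buffers’ share of the ninth bits is excluded.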
When it came to sound, the N64 used the CPU and RCP to play back sound samples. While it offered a major jump over what was possible on the SNES, it had some major disadvantages compared to its competitors. “N64 has more RAM available for the audio processor, and it can load data much faster from ROM than a PlayStation can from CD,” explains veteran video game musician Matt Furniss. “Most PlayStation games had CD soundtracks leaving all the available audio RAM for sound effects, whereas N64 had to generate both music and sound effects. So in the end PlayStation sounds better – more sound effects, higher sample rates. But N64 music could be more dynamic and seamlessly change during gameplay.”
The flexibility of the system allowed for a variety of approaches. “Cruis’n Exotica uses very large single-channel samples, compressed down from the original arcade game. For Excitebike 64 we used dual-channel audio stems which would allow a little more variety in each song,” Furniss explains. However, he was rather fortunate to be able to do so. “Both games I worked on had large cartridge ROMs. Enough space to store all the music and effects at a sample rate and compression which sounded decent.” When asked how much space that took, he draws a blank, but tells us “it must have been more than most games, it was unusual to handle the music as we did”.
Even then, large samples could only take you so far – for example, Tony Hawk’s Pro Skater 2 on the N64 has a reduced selection of songs, each of which is formed of long repeating samples. More commonly, developers would construct music from short samples of instruments, as was done on the SNES – Resident Evil 2 does this. But the use of cartridges was a problem for more than sound.
The CD-ROMs adopted by Sony and Sega had the downside of slow loading and being easier to copy, but allowed for plenty of presentational fluff such as FMV sequences and extensive voice acting. Some developers, most notably Squaresoft, found that sticking with ROM cartridges simply represented too great a constraint on their ambitions, and moved to rival platforms. Other publishers were attracted by the low manufacturing cost of CDs, allowing them to manufacture games at less financial risk, as well as potentially offering them at a lower price.
The past and the future
Beyond the console’s internals, innovation was extended to the interface of the system. The console came with four control ports as standard, making larger multiplayer games the norm – a simple change that made classics of games like GoldenEye 007 and Mario Kart 64. More radical was its unusual, three-pronged controller. The central analogue thumbstick provided fine control over direction and speed of movement, while the quartet of C-buttons was designed for 3D camera control and the Z trigger provided a substitute for L/R depending on your grip.
“Having had the privilege to work very closely with key Nintendo leaders, I got to learn from them and understand their focus on playful and surprising interactions and then the functional and simple way the hardware enables it,” says Diddy Kong Racing and Dinosaur Planet director Lee Schuneman. “It’s a controller designed with both an eye to the future (3D) and connection to the past (2D) with an understanding that players and developers need time to get used to the change that was coming as 3D worlds became the norm. The reality is that the concepts from it remain to this day in all controllers. I always like that Nintendo walk their own path and design hardware to enable the games, not the other way around. It’s never tech for tech’s sake.”
Of course, for all of its technical grunt, the N64 was defined as much by the talent of those developing for it as the system specifications. The special effects of Super Mario 64 that made the console look a step ahead of anything else were as much a showcase of ingenuity as technology. “I think a lot of the stuff we were doing was made to highlight the hardware – it wouldn’t look the same on a PlayStation,” says Goddard. “It was Nintendo, so they obviously had a lot of know-how, a lot of willingness to experiment with ideas without having to worry about going over budget too much. It was a bit of both – the artists at Nintendo are amazing, the programmers are amazing. It was the combination of having great hardware and a great team.”
“I have never approached any game development thinking about the hardware, it’s always what’s the idea and let’s do everything we can to make it real,” says Schuneman. “Of course along the way you discover things that maybe you can or can’t do but then you find a solution around it! Rare was (and I’m sure still is) full of great software engineers who were never satisfied with any limitation, so hardware weaknesses were never a problem and just something to work around.” Mostly, he remembers the people over the hardware. “The amount of world-class game designers (Miyamoto, Iwata, Ken Lobb, all the Rare founders) that I got to interact with over those N64 years was pretty amazing in hindsight, and even a prerelease Ocarina Of Time to learn from.”
As the N64 aged, there were a couple of attempts at expanding its capabilities. The 64DD was a disk drive that used proprietary magnetic disks offering 64MB of storage, some of it writable for saving data. It was first shown to the public at the Shoshinkai show in 1996, but was heavily delayed, with little information revealed in the meantime. According to Latif, who was working on Mario Artist: Paint Studio at Software Creations, it wasn’t just the public that was left in the dark.
“I actually left the project to help start ZedTwo and work on Wetrix before Mario Artist was finished,” Latif explains. “That project brings back a lot of mixed memories – it just went on for so long, and at various times it didn’t feel like it was ever going to come out. During my time on the project, some three to four years, we never even saw prototype 64DD devkits.” The device finally arrived in Japan in December 1999, and received very little support, with Doshin The Giant, F-Zero X Expansion Kit and SimCity 64 being its most notable games.
However, the 64DD came bundled with something that wound up being far more important – the Expansion Pak. This plug-in module doubled the RAM of the console, and was supported by dozens of cartridge games. Most games used this to offer high-resolution modes, but some, such as San Francisco Rush 2049, included exclusive gameplay content such as extra stages. The most ambitious were Donkey Kong 64, Perfect Dark and The Legend Of Zelda: Majora’s Mask, which all required the Expansion Pak.
Artist Mark Stevenson remembers it being beneficial in terms of standard things like level size in Donkey Kong 64, but there were also more creative uses. “One thing I remember that we did use it for was that we had a lot of dynamic lighting in there, which was hard to do and expensive,” he recalls. “One of the engineers wrote a system whereby you’d go into a cave area, and there’d be a swinging light – the first swing of that light, it’d record all of the colour changes on all of the vertices in that area, and then save it as data and just play it back as an animation rather than going on to calculate the lighting constantly. You’d get a little bit of slowdown when you went in, but after that, it was nice and smooth.”
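Stevenson’s swinging-light trick is a bake-and-replay scheme: compute the per-vertex colours once for a full swing, store them, then play the stored frames back as animation. The sketch below is a hedged reconstruction – the lighting model, names and numbers are all illustrative guesses, not Rare’s actual system.

```python
import math

def light_vertex(vertex_x, light_x):
    # Toy falloff: a vertex is brighter when the swinging light is nearer
    return max(0.0, 1.0 - abs(vertex_x - light_x))

def bake_swing(vertices, frames):
    # Record the colour of every vertex for one full swing of the light
    baked = []
    for f in range(frames):
        light_x = math.sin(2 * math.pi * f / frames)  # light swings -1..1
        baked.append([light_vertex(v, light_x) for v in vertices])
    return baked  # one colour list per frame

verts = [-1.0, 0.0, 1.0]
animation = bake_swing(verts, frames=8)

def replay(animation, frame):
    # Playback is just a table lookup -- no lighting maths at runtime
    return animation[frame % len(animation)]
```

The trade-off is exactly the one Stevenson describes: the first swing (here, the bake) is expensive, but every subsequent loop costs only the memory the Expansion Pak had just made available.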
Even with the memory boost, developers did eventually find the system’s limits – something you can see in the leaked demo of Rare’s unreleased game Dinosaur Planet. “I think we were running at 15fps most of the time so clearly had pushed it too far! But as with Diddy Kong Racing (and much of the Dinosaur Planet team were from that team also) we just wanted to realise our vision and screw the technical limitations,” says Schuneman.
“I had a great moment with Dinosaur Planet when I demoed the game on a giant projection screen at Rare with ex-Nintendo Of America president Arakawa-san and it’s like this big cinematic-style game coming out of an N64… lots of applause and a happy moment for the team.” The game ultimately received a new direction and was redirected towards the N64’s forthcoming successor. “Star Fox Adventures happened,” Schuneman adds, “which was both a blessing and a curse but out of that transition a few of us (myself, Kevin Bayliss and Phil Tossell) at least got to go work with Miyamoto-san and Iwata-san in Kyoto.”
Other games made similar leaps to the GameCube, including Capcom’s Resident Evil Zero and Silicon Knights’ Eternal Darkness: Sanity’s Requiem. Although the Nintendo 64 was a powerful console with many advanced features, it was unable to repeat the success of the SNES, which did eventually become the best-selling console of its generation. The N64 sold fewer units than its 16-bit predecessor, and Nintendo fell behind Sony to become a distant runner-up in the global home console market.
The N64 struggled terribly in Nintendo’s traditional stronghold of Japan, where the console’s relative lack of RPGs was a real problem, and it even wound up selling fewer units than the Sega Saturn. It also had fewer software releases than either of its competitors – just under 400, compared to over 1,000 for the Saturn and over 4,000 on the PlayStation. While it should be noted that Nintendo remained profitable throughout the N64 years, judged by these measures the console does not look like a success.
But it’s impossible to deny the legacy of Nintendo’s console. For a start, it was influential at a hardware-design level. As Schuneman pointed out, every console manufacturer eventually borrowed bits of the N64 controller, even if its distinctive shape wasn’t one of them, and four controller ports became standard until wireless connectivity made them redundant. What’s more, it’s arguable that the N64 did more than any of its rivals to advance 3D gaming. It was a small but significant step forward graphically – when compared to the blocky textures and wobbly walls of PlayStation and Saturn games, N64 games generally look more solid and stable.
But more than that, the hardware arrived at a time when developers were still working out how to design 3D games, and the reason that the N64’s hit list is so familiar is because so many of its games provided a template for the rest of the industry to follow. It’s certainly telling that Nintendo didn’t radically alter its designs for Mario and Zelda on the GameCube. Twenty-five years on, that’s perhaps the best way to contextualise the N64’s place in history. It’s a piece of hardware that was designed by experts in 3D, who didn’t just care about having it as a selling point, but making it look better than anyone else did. It ran games that elevated the standards that players expected of 3D games, from control schemes to inventive stage designs. Although it wasn’t the most popular platform of its day, the N64 was the console that confidently signposted our way into the 3D future.
This feature first appeared in issue 224 of Retro Gamer magazine.