"When you turn on the Revolution... and see the graphics... you will say Wow!.. ATI is assuring us of this.." -Iwata @ E3 2005 Press conference
Believe it or not, THIS IS NOT FMV. This is real-time footage, with real-time cutscenes integrated using the IN-GAME ENGINE. Amazing. I said Wow, and I'm sure you did too.
Tuesday, October 2, 2007
Monday, September 10, 2007
Well what can I say?...
http://gonintendo.com/?p=24711
"On publishers not taking advantage of the Wii’s power for graphics: “the traditional, more photorealistic route, because there you really have to push it, and they’re really not pushing it. Why not? Hmmm I don’t know, the hardware is very, very easy to understand. Now the problem might be –and it just might be- is that some studios -or some publishers specifically- are discarding the graphical capabilities automatically simply because it is a Wii title and they’re basically telling the developers “look, we won’t pay for any advanced graphics”.”-
On Wii’s “shaders” and possibilities: “If you connect you can get a lot of shader effects which would’ve been on the 360 or the PS3.” (…)”it’s got so much more power compared to the GameCube. If even with the extremely similar shader hardware, the system clockrate is so much higher, you can do so much more advanced things” (…)”in the photorealistic route there’re certain things which the basic structure of the graphics hardware was not meant for and which you have to find really clever tricks to basically make up.”"-Factor 5
Read the entire interview to get the full story, but basically it proves everything the good doctor has been telling you for the past two years.
- The Wii is no GC 1.5
- It's far more powerful than the Xbox
- It's powerful enough for "360-like" visuals and effects but on a smaller scale
- Bad graphics on Wii are due to less funding, laziness, etc.
The Wii is plenty powerful. Just not as powerful as the other two. It's in the middle performance-wise.
Oh yeah...
PWNED!
Take that, drtre haters.
Saturday, April 28, 2007
Why haven't we said Wow yet?
Iwata said, "When you turn on the Revolution and see the graphics... you will say Wow. ATI is assuring us of this." Now, this statement was made at a time when the only games around were internally developed by Nintendo. No 3rd party developers even had development kits then. This comment was made at E3 2005. ATI was telling them that gamers would say wow based on Hollywood benchmark tests, not based on the crappy PS2-to-Wii ports we are getting now.
A game that originated on final Wii hardware and has a good 2 years of development time probably would make us say wow, especially considering the Wii's size. Problem is... we won't be seeing games like this until around 2008 from 3rd parties, since most of them didn't receive their dev kits until last May or June. Games like Super Mario Galaxy have already made some gamers say wow at the graphics, and that game originated on GC hardware. It is also smart to note that traditionally it has been 3rd party devs that have pushed the graphics of Nintendo systems close to their limits, not Nintendo. Super Mario Sunshine was a great looking game but is not considered the best looking GC game by anyone (even though it had hands down the best looking water of any game last gen). There are going to be 3rd party Wii games that spank Galaxy graphically, and considering how good that game already looks... that should be exciting. Because of these crappy ports we have been getting, the improvement will actually be more exciting.
Late 2008 and on, Wii games probably won't even look like they are on the same system as the crap we are seeing now. It will probably be like comparing the character models of Eternal Darkness, which, like 98% of Wii games, started its development on a much weaker console, to the models in RE Remake. There is going to be a significant boost in graphics on the Wii from what we are seeing now, but the above average 360 game will always look better.
Wednesday, April 25, 2007
The Excite Truck 2 Rumor is not a Hoax
As you know, leaked info about Excite Truck 2 has been surfacing on the net, and then after a statement from Nintendo of Europe all of a sudden it's supposed to be a hoax? LOL...
Don't you guys know how Nintendo works by now? They didn't even deny it was coming. Read the quote again...
"It seems to have stemmed from an unofficial source. We have not announced any sequel in the Excite Truck series."
Ok. Of course Nintendo didn't announce it... it was leaked! Also notice that they say "Excite Truck series." You don't call something a series unless there is more than one. Use your head, people. The game is coming and this story was legit. You heard it here first.
Monday, April 9, 2007
INSIDE THE WII
The following is an interview with an industry insider who had hands-on time with Wii SDK 2.1. I will NOT be posting the name of this source for obvious reasons, so you can believe or not believe that what is said here is true. It's entirely up to you. With that said, enjoy.
What benchmark tests have you seen exactly?
PPC benchmarks for the Wii CPU under Linux compared to the X360 and PS3 CPUs... In fact I find it kinda funny that the Wii CPU is almost equal to a single core of the X360... Also the Wii GPU has about 1/3 the cache of the X360... See a trend?!? 1/3 seems to be a magic number when comparing Wii and X360... Don't take this to mean the X360 is 3 times the power of the Wii, as it's not... The X360 is roughly 2 - 2.5 times the speed of the Wii overall... The PS3 is roughly the same as the X360 benchmark-wise, however it gets insane scores at floating point math due to the SPE units... Overall, to compare the X360 CPU and PS3 CPU to a computer, I'd say roughly a 3.2GHz Intel P4 (non-dual-core)... The Wii is somewhere in the 2.4 - 2.6GHz Intel P4 (non-dual-core) range...
GPU benchmarks are being worked on, but as so little is really known for the 3 GPUs in question, I doubt the numbers will be accurate enough to be useful... Basically the PS3 is the most powerful, but only if you program for the SPEs, and that is rather hard... The X360 is second, but once again you need to program for a 3-core CPU (granted it's easier than Sony)... Finally the Wii: it's a simple single-core design with high memory bandwidth and just enough RAM to keep things running smoothly... Overall a great machine, but just not what some people wanted...
I feel the Wii is what is known as a "Lead-In Product"... That means it breaks new ground in a radical fashion but can still fall back to normal if needed (ie: the WiiMote is not required for all games)... This is why there was no HUGE leap in other hardware... If the Wii had failed, the loss due to a hardware-heavy machine could have been very bad for Nintendo... However, now that we can see the market is accepting the WiiMote, the next Nintendo console will include a revised version of it and also the beefed-up CPU and GPU the others have wanted... The DS is a Lead-In Product; we have yet to see what Nintendo has learned from it though... Of course not all products of this type do well (ie: Virtual Boy)...
Tell me about the dev kit…
Basically it had plenty of hardware info like the memory layouts for the GPU... We already knew that the 3 megs was split into 2 sections (frame buffer 2 megs and texture cache 1 meg). The GPU and CPU can copy its contents out to the 24 megs of RAM and vice versa, allowing for extended graphics RAM or texture caching (and people wondered why 24 megs of 1T-SRAM was on-chip)... Wii dev kits have 128 megs instead of 64 megs of GDDR3 as main RAM...
Interesting note: in order to make certain games run smoothly, final-revision GC dev kits ran at a slower clock than the retail GC... 350~ compared to 485 MHz... Even more interesting: the Mario Galaxy "Live Demo" that was shown at E3 was run on a modified GameCube, NOT Revolution hardware at all... Expect something BIG from this title... (Editor note: this interview was done before the newer Galaxy trailer was shown.) The "Re-Birth" demo is given in here as example code and it is ONLY 32 megs in size for the REAL-TIME rendered version...
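(Editor note: here's a minimal sketch of the memory pools described above, just to make the totals easy to see. It's plain C; the pool names and the dev kit figure are my own labels taken from this interview, not anything out of an official SDK.)

/* Illustrative only: memory pools as described in the interview. */
#include <stdio.h>

#define EMBEDDED_FRAMEBUFFER_MB  2   /* embedded GPU RAM used as the frame buffer  */
#define EMBEDDED_TEXCACHE_MB     1   /* embedded GPU RAM used as the texture cache */
#define ONCHIP_1TSRAM_MB        24   /* 24 megs of 1T-SRAM on-chip                 */
#define RETAIL_GDDR3_MB         64   /* retail main RAM                            */
#define DEVKIT_GDDR3_MB        128   /* dev kits reportedly carry double           */

int main(void)
{
    int retail = EMBEDDED_FRAMEBUFFER_MB + EMBEDDED_TEXCACHE_MB
               + ONCHIP_1TSRAM_MB + RETAIL_GDDR3_MB;
    int devkit = retail - RETAIL_GDDR3_MB + DEVKIT_GDDR3_MB;
    printf("Retail Wii : %d MB across all pools\n", retail);  /*  91 MB */
    printf("Dev kit    : %d MB across all pools\n", devkit);  /* 155 MB */
    return 0;
}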
So why do Wii dev kits have 128MB instead of 64?
It's breathing room... It gives them room to write some pretty sloppy code and test it before trimming it down to fit the 64 megs of the retail unit... Uncompressed textures and sound could also be used in testing... The GC kits were very restrictive as they had less power than the retail unit; Nintendo fixed that this time around...
I read somewhere that the 750CL could go as high as 1.1 GHz... so what is Broadway clocked at?
Software-clocked at 729 MHz via the BIOS.
So what can you tell me about the TEV unit in the Wii? Does it have any additional pipelines? Vertex shader support?
TEV is basically the same... The GPU has twice the pipelines now at 8... TEV makes up for this by allowing 16 texturing stages per pipeline... Vertex shader routines are handled by TEV just like pixel shader routines are...
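(Editor note: to make the "16 texturing stages per pipeline" idea concrete, here's a conceptual sketch of what a fixed-function combiner stage does. This is NOT the real GX/TEV API, just my own illustration of how chaining simple blend stages can stand in for pixel-shader-style effects; every name in it is made up.)

/* Conceptual combiner: each stage blends inputs a and b by c, then adds d.
   The output of one stage feeds the next, up to 16 stages per pipeline. */
#include <stdio.h>

typedef struct { float a, b, c, d; } CombinerStage;

float run_combiner(const CombinerStage *stages, int count, float prev)
{
    for (int i = 0; i < count && i < 16; ++i) {
        const CombinerStage *s = &stages[i];
        float a = (s->a < 0.0f) ? prev : s->a;  /* negative a means "use previous stage's result" */
        prev = (1.0f - s->c) * a + s->c * s->b + s->d;
    }
    return prev;
}

int main(void)
{
    CombinerStage stages[] = {
        {  0.8f, 0.0f, 0.0f, 0.0f },  /* stage 0: pass a texture sample (0.8) straight through */
        { -1.0f, 0.6f, 0.5f, 0.1f },  /* stage 1: blend it halfway toward 0.6, then add 0.1     */
    };
    printf("final value: %.3f\n", run_combiner(stages, 2, 0.0f));
    return 0;
}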
Do you know for a fact that it (Hollywood) now has 8 pipelines, or are you going by the #@#%nintendo interview?
FACT, it's listed in the SDK.
Any other Hollywood GPU tidbits you can tell me?
GC GPU to Wii GPU:
- 162 MHz clock to 243 MHz clock (50% increase in clock speed)
- 3 megs embedded RAM, same (Wii able to use A-RAM as additional GPU/CPU RAM)
- 18-bit color used to avoid frame drops, to 32-bit color used at all times
- 4 pixel pipelines to 8 pixel pipelines
- 4 texture pipes (16 stages each) to 8 texture pipelines (16 stages each)
- Resolution is restricted in the SDK, not in the hardware
Those are the basics...
You can see where this would be 3-5 times more powerful than the normal GC GPU...
Cool. So how fast is the 64MB of GDDR3 in the Wii, and has the 24MB of 1T-SRAM seen an increase in speed as well?
That's interesting in itself... Some GameCube programs rely on the clocks being at a certain ratio... The GameCube CPU is 3 times the clock of the GPU... The GPU clock is equal to the front-side bus... Memory is clocked at 2 times the GPU speed...
GameCube: 486 MHz CPU, 324 MHz RAM, 162 MHz GPU
The Wii would be as follows: 729 MHz CPU, 486 MHz RAM, 243 MHz GPU (A-RAM would be at 243 also)
As you can tell it's a formula that they use, and it seems to work well.
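(Editor note: the clock "formula" above is easy to verify with a few lines of arithmetic. A minimal sketch, assuming the ratios exactly as the source states them: CPU = 3x GPU, memory = 2x GPU. Nothing official about it.)

/* Derive CPU and memory clocks from the GPU clock using the stated ratios. */
#include <stdio.h>

static void print_clocks(const char *name, double gpu_mhz)
{
    double cpu_mhz = 3.0 * gpu_mhz;  /* CPU runs at 3x the GPU clock    */
    double ram_mhz = 2.0 * gpu_mhz;  /* memory runs at 2x the GPU clock */
    printf("%-8s CPU %.0f MHz, RAM %.0f MHz, GPU %.0f MHz\n",
           name, cpu_mhz, ram_mhz, gpu_mhz);
}

int main(void)
{
    print_clocks("GameCube", 162.0);  /* 486 / 324 / 162 */
    print_clocks("Wii",      243.0);  /* 729 / 486 / 243 */
    return 0;
}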
So does the file disclose any of the GPU's performance numbers? GFLOPS? Polygons per second? Fill rate? Stuff like that?
Only in a roundabout way... It gives you test code to run and shows some optimizations that can be added... The code is run via a debugger cable or wireless network (wireless requires a special disc in the Wii)...
But you can do the math. 8 pipelines at 243 MHz… that's about 1944 megapixels per second. That's a lot higher than the GC could handle and around twice what the Xbox was dishing out.
Hmmm... this must have been what Julian from Factor 5 meant when he said the Wii had an insane fill rate... well, compared to the GameCube.
Exactly.
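(Editor note: for anyone who wants the fill rate math spelled out, here's a quick sketch. The Wii and GC figures come straight from this interview; the Xbox figure of 4 pipelines at 233 MHz is my own assumption for the comparison.)

/* Theoretical peak fill rate: one pixel per pipeline per clock. */
#include <stdio.h>

static double fill_rate_mpixels(int pipelines, double clock_mhz)
{
    return pipelines * clock_mhz;
}

int main(void)
{
    double gc   = fill_rate_mpixels(4, 162.0);  /*  648 Mpixels/s */
    double wii  = fill_rate_mpixels(8, 243.0);  /* 1944 Mpixels/s */
    double xbox = fill_rate_mpixels(4, 233.0);  /*  932 Mpixels/s */
    printf("GC   : %.0f Mpixels/s\n", gc);
    printf("Wii  : %.0f Mpixels/s\n", wii);
    printf("Xbox : %.0f Mpixels/s\n", xbox);
    printf("Wii is %.1fx the GC and %.1fx the Xbox\n", wii / gc, wii / xbox);
    return 0;
}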