
Bits, bits, and more bits


This topic has been archived. This means that you cannot reply to this topic.
11 replies to this topic

#1 of 12 Jason Handy

Jason Handy

    Second Unit

  • 383 posts
  • Join Date: Oct 03 2001

Posted November 13 2001 - 07:41 AM

Can I ask a stupid question? Well, too late for that, but I have been in thoughtful repose about the fancy specification attached to all gaming systems - the number of bits.

Someone, please tell me how a 32-bit processor like the one found in X-Box can outperform a console from 2 years ago (Dreamcast) that claims to be 128-bit graphics. I must be missing some point in the middle, but it seems like the number of bits is a hokey parameter that is meant to impress the J6P in a game store.

Please put your flamethrowers down... I am unarmed...

Jason
-------------------------
Enjoy Coke

#2 of 12 BrianB

BrianB

    Producer

  • 5,211 posts
  • Join Date: Apr 29 2000

Posted November 13 2001 - 07:47 AM

Quote:
I must be missing some point in the middle, but it seems like the number of bits is a hokey parameter that is meant to impress the J6P in a game store.

Exactly.
high resolution ipod featuring dlp hd programming is the best, almost as good as playstation 2 with wega windows media on a super cd! ps2 and tivo do dolby tv with broadband hdtv!

#3 of 12 Michael St. Clair

Michael St. Clair

    Producer

  • 6,009 posts
  • Join Date: May 03 1999

Posted November 13 2001 - 07:49 AM

Jason,

Yeah, you've pretty much got it.

#4 of 12 Rob Robinson

Rob Robinson

    Second Unit

  • 295 posts
  • Join Date: Aug 08 2001

Posted November 13 2001 - 08:51 AM

Once polygons came into the mix, "bit" became a pretty worthless term, console-wise;

pretty much the only time it carried much weight as far as the market goes was the difference between 8-bit and 16-bit systems.

#5 of 12 Richard Knight

Richard Knight

    Agent

  • 27 posts
  • Join Date: Nov 04 2001

Posted November 13 2001 - 02:43 PM

Bits are a worthless technology term; these days, most processors and motherboards involve so many different bit-widths that comparison is impossible.

Since game systems all use different types of processors, it's quite possible for an Intel 733MHz Pentium III (X-Box?) to lose out to a 200MHz Hitachi RISC (Dreamcast), a 485MHz PowerPC (Gamecube), or a 300MHz RISC variant (PS2).

Intel's x86 architecture is more than 20 years old, remember.

FWIW, keep in mind that Intel's SSE extended instruction set is technically "128-bit".

Right now, the closest thing to a performance measurement would be fillrate, but outside of the PC world (where there is a common platform), it's extremely hard to find out anything one way or the other: for example, 64MB of DDR RAM sounds fast, but it's a lot slower than 24MB of SRAM, and back and forth and so on.
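Richard's RAM point can be made concrete with a rough bandwidth formula: peak bandwidth depends on bus width, clock speed, and transfers per clock, not on capacity in MB. The Python sketch below uses illustrative numbers, not actual console specs:

```python
# Rough memory-bandwidth model: capacity (MB) says nothing about speed.
# All figures below are made up for illustration, not real hardware specs.

def bandwidth_mb_s(bus_width_bits, clock_mhz, transfers_per_clock=1):
    """Peak bandwidth in MB/s for a simple synchronous memory bus."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * clock_mhz * transfers_per_clock

# A hypothetical 64-bit DDR bus at 100 MHz (DDR = 2 transfers per clock)...
ddr = bandwidth_mb_s(64, 100, transfers_per_clock=2)   # 1600.0 MB/s
# ...still loses to a wider on-chip bus at the very same clock.
wide = bandwidth_mb_s(256, 100)                        # 3200.0 MB/s
print(ddr, wide)
```

So a small, wide, fast memory can outrun a big, narrow one, which is exactly why "64MB of DDR" vs "24MB of SRAM" tells you nothing by itself.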

Richard

#6 of 12 Dave Rhodes

Dave Rhodes

    Auditioning

  • 6 posts
  • Join Date: Nov 12 2001

Posted November 13 2001 - 11:20 PM

You've also got to remember that the bits of the CPU can be misleading. The following systems are all 32-bit, but vary in terms of power: the 3DO, Game Boy Advance, PlayStation, Saturn, X-Box, FM Towns Marty, Gamecube, Jaguar, and Virtual Boy. They're all 32-bit, but it's easy to see that the 3DO is much weaker than the Gamecube, for example.

Oh, and for the record, the Dreamcast is 64-Bit, not 128-Bit. The only current system on the market that is 128-Bit is the PlayStation 2.

------------------
Dave

#7 of 12 Mike__D

Mike__D

    Supporting Actor

  • 617 posts
  • Join Date: Dec 27 2000

Posted November 14 2001 - 01:18 AM

"Oh, and for the record, the Dreamcast is 64-Bit, not 128-Bit. The only current system on the market that is 128-Bit is the PlayStation 2. "

The Dreamcast, when you're referring to the graphics chip (which is what most of the marketing refers to), is 128-bit, not 64-bit. The Dreamcast used the NEC PowerVR chip, which is 128-bit.

The Xbox uses a variation of the GeForce graphics chip, and it's 256-bit. The CPU, on the other hand, is a 32-bit processor. Speaking of CPU speeds, one cannot compare CPUs MHz-to-MHz since they process instructions differently. For example, as most folks might not realize, a 1GHz Pentium III would outperform a 1GHz Pentium 4 (not that one exists)... they are not the same chip. Which is one of the reasons Intel makes the Pentium 4 available at higher clock speeds than the Pentium III. Did I lose anyone?

I cannot comment on the PS2 or Gamecube.
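A crude way to picture why MHz-to-MHz comparisons fail is to model performance as clock speed times instructions retired per clock (IPC). The Python sketch below uses made-up IPC figures purely for illustration, not measured values for any real chip:

```python
# Clock speed alone doesn't determine performance: different designs
# retire a different number of instructions per clock (IPC).
# The IPC values here are invented for the example.

def relative_performance(clock_mhz, ipc):
    """Very crude model: work done ~ clock speed * instructions per clock."""
    return clock_mhz * ipc

chip_a = relative_performance(1000, 1.5)  # same clock, higher IPC
chip_b = relative_performance(1000, 1.0)  # same clock, lower IPC
print(chip_a > chip_b)  # True: same MHz, different real-world speed
```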



#8 of 12 Jason Handy

Jason Handy

    Second Unit

  • 383 posts
  • Join Date: Oct 03 2001

Posted November 14 2001 - 01:46 AM

Mike_D,
OK, I am with you on this one. But I am still unclear on what the "bit" actually defines. Is it the amount of information per pixel (specifying the color, maybe the texture, etc.), or does it define something else?

Jason
-------------------------
Enjoy Coke

#9 of 12 Mike__D

Mike__D

    Supporting Actor

  • 617 posts
  • Join Date: Dec 27 2000

Posted November 14 2001 - 02:08 AM

Jason,

A bit, in the simplest terms, is basically like a light switch: it's either "on" or "off". A bit is represented by either a "0" or a "1". The processor uses strings of these bits to determine what it does. A 32-bit processor can handle 32 of these 1s and 0s at a time, while a 128-bit one can handle 128 of them. But each processor is different in HOW it processes them.

So basically it's all these different combinations of "on" and "off" that make a computer (or game console) operate. My knowledge of HOW they actually WORK is limited, but I think I explained the basics. Anyone care to explain further?

Mike D.

#10 of 12 Jason Handy

Jason Handy

    Second Unit

  • 383 posts
  • Join Date: Oct 03 2001

Posted November 14 2001 - 02:40 AM

I understand what a bit is. I guess I just do not understand what the bits are defining in graphics terms. What are those 256 bits doing to make the graphics look as good as they are?

Then again, maybe I do not understand what a bit is.

Jason
-------------------------
Enjoy Coke

#11 of 12 Mike__D

Mike__D

    Supporting Actor

  • 617 posts
  • Join Date: Dec 27 2000

Posted November 14 2001 - 02:59 AM

The more bits a graphics processor can handle, the more information it can move, and that results in more detailed graphics.

But it doesn't always come down to the bits. Processor speed, memory, and memory bandwidth are important factors, as are the graphics routines they support. Both the Gamecube and Xbox have Transform & Lighting built in: basically, the graphics processor can do the geometry and lighting effects, freeing up the main CPU, whereas on the PS2 and Dreamcast the programmer has to do it in software, and it taxes the CPU.
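One place where "bits" genuinely does map onto picture quality is color depth: bits per pixel set how many colors a framebuffer can show and how much memory (and bandwidth) each frame consumes. The Python sketch below uses an illustrative 640x480 resolution, not any particular console's specs:

```python
# Color depth: bits per pixel determine the number of displayable colors
# and the size of one frame in memory. Resolution here is illustrative.

def framebuffer_kb(width, height, bits_per_pixel):
    """Memory for one frame, in kilobytes (1 KB = 1024 bytes)."""
    return width * height * bits_per_pixel / 8 / 1024

colors_16bit = 2 ** 16   # 65,536 colors ("high color")
colors_32bit = 2 ** 32   # usually 24 bits of color plus 8 bits of alpha

print(framebuffer_kb(640, 480, 16))  # 600.0 KB per frame
print(framebuffer_kb(640, 480, 32))  # 1200.0 KB per frame
```

Doubling the bits per pixel doubles the memory every frame needs, which is part of why bandwidth matters as much as the headline bit count.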

Hope this helps!

Mike D.


#12 of 12 Jason Handy

Jason Handy

    Second Unit

  • 383 posts
  • Join Date: Oct 03 2001

Posted November 14 2001 - 04:36 AM

Mike,
Thanks for the very informative discussion about bits. It is truly unfortunate that we have such loosely defined terms with ambiguous meaning - and more unfortunate that these terms are sometimes used as a benchmark to sell game machines to J6P.

Jason
-------------------------
Enjoy Coke