Where are binary digits converted to ASCII codes?
Okay, I need a quick answer here please. I'm going to talk about how a computer works at school today, so I need to get this straightened out.
Where are the binary digits converted to the ASCII codes?
My book says it happens in the memory chip, but is this true? Why, and how?
A computer doesn't understand anything but 0 and 1, and everything is transferred as binary digits, so how can it suddenly have ASCII codes in memory?
-----------------------------
- Datatubbe
They didn't accept me in the Tubby World!
The binary digits are never converted to ASCII. It's up to the software to read a byte (0..255, i.e. eight 1s and 0s) and display the corresponding letter on the screen.
If you're looking for a piece of hardware, the CPU would be the place.
Say we wish to store the letter "A" in memory. According to the ASCII table in front of me, "A" has the decimal code 65.
Now 65 in binary is 01000001, so this is what we store in memory.
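Here is a tiny C sketch (my own illustration, not from the book) showing that the letter and the number are literally the same byte:

#include <stdio.h>

int main(void)
{
    char c = 'A';        /* the compiler stores the byte 65 (binary 01000001) */
    printf("%c\n", c);   /* read as a character: prints A  */
    printf("%d\n", c);   /* read as a number:    prints 65 */
    return 0;
}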
Edited by - newdeal on February 6, 2002 5:30:22 AM
Yeah, that's true, but where on earth is the A converted to what I see? Like, you see this "A" here, right? I don't see any binary digits, I see letters. Where is this transformed?
This is what I mean.
-----------------------------
- Datatubbe
They didn't accept me in the Tubby World!
In text mode (no graphics) there is an area of memory called the video memory, and every other byte contains one of these ASCII codes. The VGA card has a lookup table which stores the pixel data (a series of dots) for each ASCII code. When the VGA card encounters 65 in video memory, it looks up the pixel data for A and draws the appropriate dots on the screen to make the letter appear.
In graphics modes the video memory instead stores which pixels should be on/off, and the lookup table is in software; it is called a font. When you want an A to appear, the software looks up the pixel data in its font table and copies that pixel data to video memory, and then the VGA card draws whatever is in video memory to the screen.
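If you want to see roughly what that looks like in code, here is a minimal sketch, assuming an old DOS-style compiler (e.g. Turbo C) with far pointers and MK_FP from <dos.h>; on a modern protected-mode OS you cannot poke video memory directly like this:

#include <dos.h>   /* MK_FP and far pointers: old DOS compilers only */

/* Put one character on an 80-column text screen by writing its ASCII
   code straight into text-mode video memory at segment 0xB800.
   The VGA card turns the code into pixels on its own. */
void put_char_at(int row, int col, char ascii, unsigned char attr)
{
    unsigned char far *video = (unsigned char far *)MK_FP(0xB800, 0);
    int offset = (row * 80 + col) * 2;  /* 2 bytes per cell: code + attribute */
    video[offset]     = ascii;          /* e.g. 65 for A */
    video[offset + 1] = attr;           /* e.g. 0x07 = light grey on black */
}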
I think that's approximately correct.
Yes, internally everything is 1s and 0s, but that isn't something you need to worry about. ASCII codes just allow up to 256 characters (0..255) to be represented, with a special code for each. Even keys like Backspace and Enter have codes (8 and 13, respectively). The operating system (Windows, Linux) is what turns an ASCII code into an actual character on the screen.
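You can check this yourself with a tiny C program (my own example):

#include <stdio.h>

int main(void)
{
    /* Control keys are just bytes too */
    printf("Backspace = %d\n", '\b');  /* prints 8  */
    printf("Enter/CR  = %d\n", '\r');  /* prints 13 */
    printf("A         = %d\n", 'A');   /* prints 65 */
    return 0;
}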
-----------------------------
The sad thing about artificial intelligence is that it lacks artifice and therefore intelligence.
My friend, you are not going to see all the 1s and 0s here, because 1 and 0 have binary codes of their own. You never have to convert numbers from binary to decimal in software; you just read them out of memory, and the hardware does all the encoding/decoding for you. Isn't it sweet of Intel and company to do that for you? When I type "A" here, you are not going to see 1s and 0s, because it's transparent, and since it's base-2 math it wouldn't make sense to show it anyway; you would just see 65. Maybe in some other dimension you might see another number. I think there is a quote that says, "You can be a professional programmer and still not know anything about binary numbers."
Here are some interesting macros I found in the Zen engine:
///////////////////////////////////////////////////////////////////////////
// USEFUL MACROS
///////////////////////////////////////////////////////////////////////////
//
// zBin  - Used to convert a constant binary number to a regular
//         decimal number. Example: zBin( 111 ) == 7. zBin can only handle
//         16 bits' worth of binary digits, so it does the lower 16 bits
//         of a dword.
//
// zBinH - Works just like zBin, but it does the upper 16 bits of a dword.
//
// _ZBIN - A private helper macro; it is not meant to be used directly.
//
///////////////////////////////////////////////////////////////////////////
#define _ZBIN( A, L ) ( (u32)((((u64)0##A) >> (3*(L))) & 1) << (L) )
#define zBin( N )  ( _ZBIN( N, 0 )  | _ZBIN( N, 1 )  | _ZBIN( N, 2 )  | _ZBIN( N, 3 )  | \
                     _ZBIN( N, 4 )  | _ZBIN( N, 5 )  | _ZBIN( N, 6 )  | _ZBIN( N, 7 )  | \
                     _ZBIN( N, 8 )  | _ZBIN( N, 9 )  | _ZBIN( N, 10 ) | _ZBIN( N, 11 ) | \
                     _ZBIN( N, 12 ) | _ZBIN( N, 13 ) | _ZBIN( N, 14 ) | _ZBIN( N, 15 ) )
#define zBinH( N ) ( zBin( N ) << 16 )
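In case it helps, here is how I'd expect them to be used (my guess, assuming u32 and u64 are the engine's typedefs for unsigned 32- and 64-bit integers; the trick is that 0##A turns the written digits into an octal constant, so each digit occupies 3 bits):

u32 low  = zBin( 1010 );   /* == 10 (binary 1010), lower 16 bits of a dword */
u32 high = zBinH( 1010 );  /* == 10 << 16, upper 16 bits of a dword */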
- err, the last signature sucked bigtime!
Rate me up.
quote: Yeah, that's true, but where on earth is the A converted to what I see? Like, you see this "A" here, right? I don't see any binary digits, I see letters. Where is this transformed?
This is what I mean.
Here's what happens, for your example specifically: the "A" showing up on the screen.
When the program needs to put characters on the screen, it gets the next ASCII code from memory (code 65 in this case) and then looks that code up in its font table. Normally, these fonts are just bit masks, so at the place where ASCII code 65 is stored you'll see something like this (shown here as eight rows of 16 bits, so you can see the A):
0000000110000000
0000001111000000
0000011001100000
0000110000110000
0001111111111000
0011000000001100
0110000000000110
1100000000000011
So it will take that font and scan it bit by bit. If it encounters a 0, it won't put a pixel at the corresponding place on the screen. If it finds a 1, it will put a pixel up.
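Here is a small C sketch of that scanning loop (my own illustration, storing the bitmap above as eight 16-bit rows):

#include <stdio.h>
#include <stdint.h>

/* The 16x8 bitmap for "A" from above, one 16-pixel row per entry */
static const uint16_t glyph_A[8] = {
    0x0180, 0x03C0, 0x0660, 0x0C30,
    0x1FF8, 0x300C, 0x6006, 0xC003
};

int main(void)
{
    for (int row = 0; row < 8; row++) {
        for (int bit = 15; bit >= 0; bit--)                /* scan left to right */
            putchar(((glyph_A[row] >> bit) & 1) ? '#' : ' ');
        putchar('\n');
    }
    return 0;
}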
Hope that clears things up for you.
Nutts