How computers display raw, low-level text and graphics

Solution 1

That's (partly) the role of the BIOS.

The Basic Input Output System (BIOS) is responsible for providing a common interface to operating systems, despite the hardware differences between actual machines.

That said, for graphics specifically, there are different ways of drawing to the screen. There are TTY output services you can invoke through the BIOS (INT 10h), but those are only available in real mode. If you want to draw anything in protected mode, you need to program the VGA hardware yourself. I can't explain it better than the OSDev wiki, so look there for more info -- but basically, video memory is memory-mapped, so you can write characters into the text-mode buffer starting at address 0xB8000 (graphics modes map at 0xA0000) and they appear on the screen.
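To make the memory-mapped text buffer concrete, here is a minimal C sketch of the layout: each cell is two bytes, the character followed by an attribute byte. It simulates the buffer in an ordinary array, since actually poking 0xB8000 only works inside a kernel or a real-mode environment; the names and dimensions are just the usual 80x25 defaults.

```c
#include <stdint.h>
#include <stddef.h>

#define COLS 80
#define ROWS 25

/* Stand-in for the buffer at physical address 0xB8000.
 * In a real kernel this would be: uint8_t *vram = (uint8_t *)0xB8000; */
static uint8_t vram[ROWS * COLS * 2];

/* Write one character cell: byte 0 is the character, byte 1 the attribute
 * (low nibble = foreground color, high nibble = background color). */
void put_cell(int row, int col, char ch, uint8_t attr)
{
    size_t off = ((size_t)row * COLS + col) * 2;
    vram[off]     = (uint8_t)ch;
    vram[off + 1] = attr;
}
```

0x07 (light grey on black) is the classic default attribute; 0x1F is white on blue.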

If you need higher resolution than VGA, you need to use the VESA BIOS extensions; I'm not familiar with it, but try looking at the GRUB source code for more info.
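For a feel of what the VESA route involves: the bootloader calls INT 10h with AX=4F01h in real mode to fetch a "mode info block", reads the framebuffer address and pitch out of it, switches modes with AX=4F02h, and from then on draws by writing to that address. A partial sketch of the block's layout in C, with offsets per the VBE 3.0 specification (the field names here are my own; `__attribute__((packed))` is GCC/Clang-specific):

```c
#include <stdint.h>
#include <stddef.h>

/* Partial layout of the VBE mode info block returned by INT 10h, AX=4F01h.
 * A bootloader queries this in real mode, then draws via `framebuffer`
 * (the linear framebuffer's physical address) in protected mode. */
struct __attribute__((packed)) VbeModeInfo {
    uint16_t attributes;       /* 0x00 */
    uint8_t  win_a, win_b;     /* 0x02 */
    uint16_t granularity;      /* 0x04 */
    uint16_t win_size;         /* 0x06 */
    uint16_t seg_a, seg_b;     /* 0x08 */
    uint32_t win_func_ptr;     /* 0x0C */
    uint16_t pitch;            /* 0x10: bytes per scan line */
    uint16_t width;            /* 0x12: horizontal resolution */
    uint16_t height;           /* 0x14: vertical resolution */
    uint8_t  w_char, y_char, planes, bpp, banks;          /* 0x16 .. 0x1A */
    uint8_t  memory_model, bank_size, image_pages, rsvd0; /* 0x1B .. 0x1E */
    uint8_t  red_mask, red_pos, green_mask, green_pos;    /* 0x1F .. 0x22 */
    uint8_t  blue_mask, blue_pos, rsv_mask, rsv_pos;      /* 0x23 .. 0x26 */
    uint8_t  color_attrs;      /* 0x27 */
    uint32_t framebuffer;      /* 0x28: physical address of the LFB */
};
```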

If you happen to be familiar with D -- I wrote a little boot loader a while back that was able to write to the screen (text only). If you're interested, here's the code:

// Assumes CONSOLE_WIDTH, CONSOLE_HEIGHT, DEFAULT_ATTRIBUTES, isBochs,
// and the port I/O helpers _inp/_outp are defined elsewhere in the loader.
align(2) struct Cell { char ch; ubyte flags = 0x07; } // character + attribute

@property Cell[] vram()
{ return (cast(Cell*)0xB8000)[0 .. CONSOLE_WIDTH * CONSOLE_HEIGHT]; }

void putc(char c)
{
    if (isBochs) { _outp(0xE9, c); }  // Output to the Bochs terminal!

    bool isNewline = c == '\n';
    while (cursorPos + (isNewline ? 0 : 1) > vram.length)
    {
        for (short column = CONSOLE_WIDTH - 1; column >= 0; column--)
        {
            foreach (row; 0 .. CONSOLE_HEIGHT - 1)
            {
                uint cell = column + cast(uint)row * CONSOLE_WIDTH;
                vram[cell] = vram[cell + CONSOLE_WIDTH];
            }
            vram[column + (CONSOLE_HEIGHT - 1) * CONSOLE_WIDTH].ch = ' ';
        }
        cursorPos = cast(ushort)(cursorPos - CONSOLE_WIDTH);
    }
    if (isNewline)
        cursorPos = cast(ushort)
            ((1 + cursorPos / CONSOLE_WIDTH) * CONSOLE_WIDTH);
    else vram[cursorPos++].ch = c;
}

void putc(char c, ubyte attrib) { vram[cursorPos++] = Cell(c, attrib); } // advance the cursor, as above

void memdump(void* pMem, size_t length)
{
    foreach (i; 0 .. length)
        putc((cast(char*)pMem)[i]);
}

void clear(char clear_to = '\0', ubyte attrib = DEFAULT_ATTRIBUTES)
{
    foreach (pos; 0 .. vram.length)
        vram[pos] = Cell(clear_to, attrib);
    cursorPos = 0;
}

@property ushort cursorPos()
{
    ushort result = 0;
    _outp(0x3D4, 14);
    result += _inp(0x3D5) << 8;
    _outp(0x3D4, 15);
    result += _inp(0x3D5);
    return result;
}

@property void cursorPos(ushort position)
{
    _outp(0x3D4, 14);
    _outp(0x3D5, (position >> 8) & 0xFF);
    _outp(0x3D4, 15);
    _outp(0x3D5, position & 0xFF);
}

Solution 2

From the early days of the IBM PC and its clones, the display adapter hardware was very simple: a small block of memory was dedicated to a grid of character cells (80x25 characters in the standard mode), with two bytes of memory for each cell. One byte selected the character, and the other selected its "attributes" - foreground and background colors plus blink control for color adapters; bold, underlined, blinking, or reverse video for monochrome adapters. The video output hardware looked up pixels from a ROM table of character shapes according to the contents of character memory.

In order to offer a certain degree of hardware independence, the BIOS interface to the character map required a software interrupt to be executed in order to set a single character cell on the screen. This was slow and inefficient. However, the character memory was directly addressable by the CPU as well, so if you knew what hardware was present, you could write directly to memory instead. Either way, once set, the character would remain on screen until changed, and the total character memory you needed to work with was 4000 bytes - about the size of a single 32x32 full color texture today!
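The two-byte cell and the 4000-byte total are easy to verify: 80 x 25 cells x 2 bytes = 4000. The color attribute byte packs blink, background, and foreground into bit fields; a small C sketch of encoding and decoding it (helper names are my own):

```c
#include <stdint.h>

/* Color text-mode attribute byte layout:
 *   bit 7     blink (or bright background, depending on adapter setup)
 *   bits 6-4  background color (0-7)
 *   bits 3-0  foreground color (0-15)                                   */
uint8_t make_attr(uint8_t fg, uint8_t bg, int blink)
{
    return (uint8_t)((blink ? 0x80 : 0) | ((bg & 0x7) << 4) | (fg & 0xF));
}

uint8_t attr_fg(uint8_t a)    { return a & 0x0F; }
uint8_t attr_bg(uint8_t a)    { return (a >> 4) & 0x07; }
int     attr_blink(uint8_t a) { return (a & 0x80) != 0; }
```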

In the graphics modes the situation was similar: each pixel on screen was associated with a particular location in memory, and there was a BIOS set-pixel interface, but high-performance work required writing directly to memory. Later standards like VESA let the system do a few slow BIOS-based queries to learn the memory layout of the hardware, then work directly with memory. This is how an OS can display graphics without a specialized driver, although modern OSes do also include basic drivers for every major GPU manufacturer's hardware. Even the newest NVIDIA card will support several backwards-compatibility modes, probably all the way back to IBM CGA.
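As a concrete instance of the "each pixel has a memory location" model: in the classic VGA mode 13h (320x200, 256 colors), the framebuffer mapped at 0xA0000 is linear with one byte (a palette index) per pixel, so the offset is simply y * 320 + x. A sketch against a simulated buffer, since the real address is only reachable from a kernel or DOS program:

```c
#include <stdint.h>
#include <stddef.h>

#define WIDTH  320
#define HEIGHT 200

/* Stand-in for the mode 13h framebuffer at physical address 0xA0000. */
static uint8_t framebuffer[WIDTH * HEIGHT];

/* One byte per pixel: the byte is an index into the 256-entry palette. */
void set_pixel(int x, int y, uint8_t color)
{
    framebuffer[(size_t)y * WIDTH + x] = color;
}
```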

One important difference between 3D graphics and 2D is that in 2D you don't generally need to redraw the entire screen every frame. In 3D, if the camera moves even a tiny bit, every pixel on the screen might change; in 2D, if you aren't scrolling, most of the screen will be unchanged frame-to-frame, and even if you are scrolling, you can generally do a fast memory-to-memory copy instead of recomposing the whole scene. So it's nothing like having to execute INT 10h for every pixel every frame.
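The "fast memory-to-memory copy" for scrolling looks something like this in C: shift every row up by one with a single memmove, then blank the bottom row. This is a sketch on a simulated text buffer, not a driver; on real hardware the same writes would target the mapped buffer at 0xB8000.

```c
#include <stdint.h>
#include <string.h>

#define COLS 80
#define ROWS 25
#define CELL 2               /* bytes per cell: character + attribute */

static uint8_t text_buf[ROWS * COLS * CELL];

/* Scroll up one text row: copy rows 1..24 over rows 0..23, then clear
 * the last row to spaces with the default attribute 0x07. */
void scroll_up(void)
{
    memmove(text_buf, text_buf + COLS * CELL, (ROWS - 1) * COLS * CELL);
    for (int i = 0; i < COLS; i++) {
        text_buf[((ROWS - 1) * COLS + i) * CELL]     = ' ';
        text_buf[((ROWS - 1) * COLS + i) * CELL + 1] = 0x07;
    }
}
```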

Source: I'm really old

Solution 3

During boot, the system BIOS looks for the video adapter. In particular, it looks for the video adapter's built-in BIOS program and runs it. This video BIOS is normally found at segment C000h in memory. The system BIOS executes the video BIOS, which initializes the video adapter.

Which levels or modes of video/graphics the BIOS can display natively, without an OS or drivers, is primarily dependent on the video BIOS itself.

Source/more info: "System Boot Sequence"

Solution 4

Our computers, at boot, as far as I understand it, are in text mode, in which a character can be displayed using the software interrupt 0x10 when AH=0x0E.

You're talking about legacy BIOS functions. In fact, you don't need to use those functions at all; you can write characters directly into video memory.

how on earth do computers output graphics at the lowest level, say, below the OS?

This depends strongly on how the OS operates. At the hardware level, though, the operation is the same: the video card has video RAM, which stores (simplifying) the next image to be drawn on the screen. You can think of each address as a byte representing a pixel (in practice you generally need more than one byte per pixel). The video controller then takes care of translating this into a signal the monitor understands.
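With more than one byte per pixel, and with rows possibly padded, the address arithmetic becomes offset = y * pitch + x * bytes_per_pixel. A hedged sketch for a 32-bit XRGB framebuffer simulated in an array; the byte order shown assumes little-endian XRGB8888, and a real driver would read the channel masks and pitch from the mode information instead of hard-coding them:

```c
#include <stdint.h>
#include <stddef.h>

#define FB_WIDTH  640
#define FB_HEIGHT 480
#define FB_BPP    4                      /* bytes per pixel (XRGB8888)   */
#define FB_PITCH  (FB_WIDTH * FB_BPP)    /* real hardware may pad rows   */

static uint8_t fb[FB_HEIGHT * FB_PITCH]; /* stand-in for mapped VRAM */

void put_pixel(int x, int y, uint8_t r, uint8_t g, uint8_t b)
{
    size_t off = (size_t)y * FB_PITCH + (size_t)x * FB_BPP;
    fb[off]     = b;   /* little-endian XRGB8888: B, G, R, X in memory */
    fb[off + 1] = g;
    fb[off + 2] = r;
    fb[off + 3] = 0;
}
```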

Is there a standard that defines basic outputting of vertices, polygons, fonts, etc. (below OpenGL for example, which OpenGL might use)?

AFAIK, no. There is no standard for logical graphics representations at that level.

What makes me ask is why OS' can often be fine without official drivers installed; how do they do that?

Because generic drivers are already bundled with the OS.

Author: Doddy

Updated on September 18, 2022

Comments

  • Doddy
    Doddy almost 2 years

    My ever-growing interest in computers is making me ask deeper questions, that we don't seem to have to ask anymore. Our computers, at boot, as far as I understand it, are in text mode, in which a character can be displayed using the software interrupt 0x10 when AH=0x0e. We've all seen the famous booting font that always looks the same, regardless of what computer is booting.

    So, how on earth do computers output graphics at the lowest level, say, below the OS? And also, surely graphics aren't outputted a pixel at a time using software interrupts, as that sounds very slow?

    Is there a standard that defines basic outputting of vertices, polygons, fonts, etc. (below OpenGL, for example, which OpenGL might use)? What makes me ask is why OSes can often be fine without official drivers installed; how do they do that?

    Apologies if my assumptions are incorrect. I would be very grateful for elaboration on these topics!

    • hippietrail
      hippietrail over 12 years
      It's not all computers, it depends on the display hardware. PCs based on the IBM evolutionary line originally just had text mode, then various graphics modes were added. We still have that legacy. Older computers often had only text mode. Other computers had only graphics modes, like the Amiga and the original Macintoshes.
  • Doddy
    Doddy over 12 years
    Thanks for the answer. Why the GRUB source code? I didn't know GRUB is involved with graphics itself.
  • badboy24
    badboy24 over 12 years
    @panic Grub2 can do some interesting things, including displaying an image as the background when on the OS selection menu. Not certain about grub.
  • user541686
    user541686 over 12 years
      @panic: I'm not sure if it's GRUB or GRUB2 (I used "GRUB" generically), but they can definitely switch into a native 1600x900 resolution on my monitor, so I'm sure they would have things you can look into. (I looked into them a while back and remember they were helpful, but it's been a while, so I don't remember exactly which file(s) you should look in.)
  • Doddy
    Doddy over 12 years
    Very informative, thanks. I wish I could accept more than one!
  • n611x007
    n611x007 almost 11 years
    does the term memory-mapped mean that there is a "buffer" area in the RAM where the CPU (or is there another DMA unit?) can read and write on behalf of both the graphics card and the program that tries to write to the display?
  • user541686
    user541686 almost 11 years
    @naxa: Good question... usually, memory mapped hardware means that there is an address in memory that doesn't actually correspond to RAM. Rather, reading or writing to it corresponds to performing some action on some hardware. That said, I'm not sure how you could distinguish between actual RAM and hardware that makes a block of memory look like RAM... it could be that it's actual RAM in this case, but my impression was that reads and writes are intercepted by hardware.