Tags: assembly, nasm, 32-bit

How to set a VGA video mode in 32-bit NASM Assembly


I am trying to change the VGA mode in this code. People say I can use int 10h; however, I get an error in VirtualBox. At this point I want to set any kind of VGA mode. I managed to set up the Global Descriptor Table and write to the 80x25 text screen, but not to initialize VGA. How do I do so?

Code:

;===DATA============================================================================================
[bits   16]
[org    0x500]

jmp boot

;===VARIABLES======================================================================================
gdt_start:
    ; Null descriptor (required)
    dd  0
    dd  0

    ; Code descriptor: base 0, limit 0xFFFFF, 4 KiB granularity, 32-bit
    dw  0xFFFF          ; limit (bits 0-15)
    dw  0               ; base (bits 0-15)
    db  0               ; base (bits 16-23)
    db  10011010b       ; access: present, ring 0, code, readable
    db  11001111b       ; flags: 4 KiB granularity, 32-bit; limit (bits 16-19)
    db  0               ; base (bits 24-31)

    ; Data descriptor: base 0, limit 0xFFFFF, 4 KiB granularity, 32-bit
    dw  0xFFFF          ; limit (bits 0-15)
    dw  0               ; base (bits 0-15)
    db  0               ; base (bits 16-23)
    db  10010010b       ; access: present, ring 0, data, writable
    db  11001111b       ; flags: 4 KiB granularity, 32-bit; limit (bits 16-19)
    db  0               ; base (bits 24-31)
gdt_end:
GDT_loader:
    dw  gdt_end - gdt_start - 1     ; GDT limit (size - 1)
    dd  gdt_start                   ; GDT base address

;===CODE============================================================================================
load_GDT:
    pusha
    cli
    lgdt    [GDT_loader]
    sti
    popa
    ret

boot:
    ; Set up the stack
    cli
    mov ax, 0x0000
    mov ss, ax
    mov sp, 0xFFFE      ; keep SP word-aligned (0xFFFF would misalign pushes)
    sti

    ; Clear segment registers
    mov ax, 00h
    mov ds, ax
    mov es, ax
    mov fs, ax
    mov gs, ax

    call load_GDT

    cli
    mov eax, cr0
    or  eax, 1
    mov cr0, eax
    jmp 08h:kernel

;===VARIABLES=======================================================================================
bootingmessage      db "Starting up", 0x00

;===DATA============================================================================================
[bits   32]
;===CODE============================================================================================
kernel:
    mov ax, 10h
    mov ds, ax
    mov es, ax
    mov ss, ax
    mov esp, 0x900000

    cli

    ;*****************
    ;Setup VGA Here
    ;*****************

    jmp $

Solution

  • I'll have to assume that the reason you get an error (in VirtualBox) when you try to use int 0x10 is that you're doing it while in protected mode (and BIOS functions including int 0x10 expect to be executed in real mode). Otherwise, it should've worked perfectly in VirtualBox.

    The alternatives are:

    • set the video mode during boot (before you switch to protected mode)

    • use virtual8086 mode (to execute real mode code while still in protected mode)

    • use an emulator or interpreter (to execute/interpret real mode code while in protected mode)

    • switch to real mode temporarily every time you change the video mode. This is an extremely bad idea (prevents native drivers from functioning properly due to IRQs being received while not in protected mode).

    • write a native driver for "generic VGA".

    • write a native driver for every different video card
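
    For the first option, a minimal sketch of what that looks like in the boot code above: call int 0x10 while still in real mode, e.g. right after `call load_GDT` and before the cr0 switch. Mode 0x13 here is just an example; any mode number the BIOS supports works the same way.

    ```nasm
    ; Sketch: set VGA mode 0x13 (320x200, 256 colours) while still in
    ; real mode, i.e. before setting the PE bit in CR0.
    ; Mode 0x13 is an assumption; substitute any BIOS-supported mode.
    set_video_mode:
        mov ah, 0x00        ; BIOS video service: set video mode
        mov al, 0x13        ; mode 0x13 -> frame buffer at 0xA0000
        int 0x10            ; must run in real mode (or v8086 mode)
        ret
    ```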

    Of all these options, I'd say the first and last are the only sane ones. The others are a complete waste of time that won't be useful in the long term - anything that depends on real mode BIOS functions after boot will not work on modern (UEFI) computers; and VGA is so ugly (extremely poor colour depth and resolution) that it makes people's eyes bleed. It also assumes "100% VGA compatible at the hardware level with no bugs, quirks or incompatibilities", which is a relatively dodgy assumption given that actual VGA cards haven't existed for 30+ years and VGA is only something real video cards grudgingly tolerate for backward compatibility purposes.
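
    To make the trade-off concrete: once a mode like 0x13 has been set during boot, the protected-mode kernel never needs the BIOS again - it can treat the screen as a raw frame buffer at physical address 0xA0000. A sketch (this assumes the flat data segment from the GDT in the question, and that mode 0x13 is active):

    ```nasm
    [bits 32]
    ; Sketch: plot one pixel in mode 0x13 from protected mode.
    ; Assumes DS is a flat data segment and mode 0x13 was set during boot.
    plot_pixel:                  ; in: EAX = x, EBX = y, CL = colour index
        imul ebx, 320            ; offset = y * 320 + x (mode 0x13 is 320 wide)
        add  ebx, eax
        mov  [0xA0000 + ebx], cl ; VGA frame buffer starts at 0xA0000
        ret
    ```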

    More specifically, your boot loader should probably set a video mode during boot (using VBE if the boot loader is designed for BIOS, and GOP or UGA if it is designed for UEFI) and tell the OS/kernel the relevant details (frame buffer address, horizontal and vertical resolution, colour depth and pixel format, bytes per scan line); the OS should then use this information to provide a "raw frame buffer only" driver (until/unless it loads a native driver for the specific video card/GPU).
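
    For a BIOS boot loader that typically means VBE function 0x4F02 (with bit 14 set in BX to request a linear frame buffer) plus 0x4F01 to fetch the mode's details, all while still in real mode. A hedged sketch - the mode number 0x118 and the 0x28 offset of PhysBasePtr are taken from the VBE spec, but a real loader should enumerate modes via function 0x4F00 rather than hard-code one:

    ```nasm
    ; Sketch: set a VBE mode with a linear frame buffer, still in real mode.
    ; 0x118 (commonly 1024x768) is NOT guaranteed to exist; walk the mode
    ; list from VBE function 0x4F00 in real code.
    set_vbe_mode:
        mov ax, 0x4F01          ; VBE: get mode info into ES:DI
        mov cx, 0x118           ; assumed mode number
        mov di, modeinfo        ; 256-byte mode info block
        int 0x10
        cmp ax, 0x004F          ; AL=0x4F: supported, AH=0x00: success
        jne .fail

        mov ax, 0x4F02          ; VBE: set mode
        mov bx, 0x4118          ; bit 14 = use linear frame buffer
        int 0x10
        cmp ax, 0x004F
        jne .fail

        mov eax, [modeinfo + 0x28]  ; PhysBasePtr: linear frame buffer address
        mov [fb_addr], eax          ; hand this to the kernel later
        ret
    .fail:
        ret                     ; fall back to another mode / text mode

    fb_addr   dd 0
    modeinfo  times 256 db 0
    ```

    The kernel then draws by writing pixels at `fb_addr` using the pitch ("bytes per scan line") and pixel format reported in the mode info block, which is exactly the "raw frame buffer only" driver described above.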