External Monitor on Laptop with 'Optimus technology' dual GPUs - how does it work?

Hi,
Following on from my other thread about my work desktop installation, I am now wondering 'how it works' on my ASUS X550LB lappy (all working fine - just curious).

From here

GPU Performance

The X550LB uses Nvidia’s Optimus technology to switch between the integrated Intel HD Graphics 4400 and the dedicated GeForce GT 740M GPU. During less performance-intensive tasks, the system relies on the Intel GPU but switches to the dedicated GPU for 3D tasks such as gaming. Alternatively, the user can decide which application uses which GPU or choose one or the other GPU to take over all graphics tasks. The GT 740M is a middle-class DirectX 11 GPU with a base clock speed of 980 MHz and a Turbo speed of 1058 MHz.

So, what happens on my AL Openbox installation?

leigh@archlabs ~ % uname -a
Linux archlabs 5.18.12-arch1-1 #1 SMP PREEMPT_DYNAMIC Fri, 15 Jul 2022 15:33:02 +0000 x86_64 GNU/Linux

Does it do a similar thing to Windows, in terms of using the onboard GPU until the Nvidia is needed, or am I only using one GPU, or something else?
Can I control this behaviour?

I seem to remember the word ‘Bumblebee’, but I have not installed that.

My system:

leigh@archlabs ~ % lspci -v | grep -A1 -e VGA -e 3D
00:02.0 VGA compatible controller: Intel Corporation Haswell-ULT Integrated Graphics Controller (rev 09) (prog-if 00 [VGA controller])
	Subsystem: ASUSTeK Computer Inc. Device 131d
--
04:00.0 3D controller: NVIDIA Corporation GK208M [GeForce GT 740M] (rev a1)
	Subsystem: ASUSTeK Computer Inc. Device 131d
leigh@archlabs ~ % xrandr                     
Screen 0: minimum 8 x 8, current 1920 x 1848, maximum 32767 x 32767
eDP1 connected 1366x768+277+1080 (normal left inverted right x axis y axis) 340mm x 190mm
   1366x768      60.06*+
   1280x720      59.86    60.00    59.74  
   1024x768      60.00  
   1024x576      60.00    59.90    59.82  
   960x540       60.00    59.63    59.82  
   800x600       60.32    56.25  
   864x486       60.00    59.92    59.57  
   640x480       59.94  
   720x405       59.51    60.00    58.99  
   680x384       60.00  
   640x360       59.84    59.32    60.00  
DP1 disconnected (normal left inverted right x axis y axis)
HDMI1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 480mm x 270mm
   1920x1080     60.00*+  50.00    59.94    59.99  
   1920x1080i    60.00    50.00    59.94  
   1600x1200     60.00  
   1680x1050     59.88  
   1280x1024     75.02    60.02  
   1440x900      59.90  
   1280x960      60.00  
   1366x768      59.79  
   1152x864      75.00  
   1280x720      60.00    50.00    59.94  
   1024x768      75.03    70.07    60.00  
   832x624       74.55  
   800x600       72.19    75.00    60.32    56.25  
   720x576       50.00  
   720x480       60.00    59.94  
   640x480       75.00    72.81    66.67    60.00    59.94  
   720x400       70.08  
VIRTUAL1 disconnected (normal left inverted right x axis y axis)

https://wiki.archlinux.org/title/NVIDIA_Optimus

#Using Bumblebee - provides Windows-like functionality by allowing you to run selected applications with NVIDIA graphics while using Intel graphics for everything else. Has significant performance issues.

Further reading:

https://www.cyberciti.biz/open-source/command-line-hacks/linux-gpu-monitoring-and-diagnostic-commands/

I have a lappy, but I only have it set up for use as a backup. I can’t even tell you what GPU is in it.


Thanks,
so from the output below, am I only using my Intel integrated graphics controller?

leigh@archlabs ~ % xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x46 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 4 outputs: 4 associated providers: 1 name:Intel
Provider 1: id: 0xa8 cap: 0x5, Source Output, Source Offload crtcs: 0 outputs: 0 associated providers: 1 name:nouveau
leigh@archlabs ~ % xrandr -q
Screen 0: minimum 8 x 8, current 1920 x 1848, maximum 32767 x 32767
eDP1 connected 1366x768+277+1080 (normal left inverted right x axis y axis) 340mm x 190mm
   1366x768      60.06*+
   1280x720      59.86    60.00    59.74  
   1024x768      60.00  
   1024x576      60.00    59.90    59.82  
   960x540       60.00    59.63    59.82  
   800x600       60.32    56.25  
   864x486       60.00    59.92    59.57  
   640x480       59.94  
   720x405       59.51    60.00    58.99  
   680x384       60.00  
   640x360       59.84    59.32    60.00  
DP1 disconnected (normal left inverted right x axis y axis)
HDMI1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 480mm x 270mm
   1920x1080     60.00*+  50.00    59.94    59.99  
   1920x1080i    60.00    50.00    59.94  
   1600x1200     60.00  
   1680x1050     59.88  
   1280x1024     75.02    60.02  
   1440x900      59.90  
   1280x960      60.00  
   1366x768      59.79  
   1152x864      75.00  
   1280x720      60.00    50.00    59.94  
   1024x768      75.03    70.07    60.00  
   832x624       74.55  
   800x600       72.19    75.00    60.32    56.25  
   720x576       50.00  
   720x480       60.00    59.94  
   640x480       75.00    72.81    66.67    60.00    59.94  
   720x400       70.08  
VIRTUAL1 disconnected (normal left inverted right x axis y axis)
leigh@archlabs ~ % sudo lshw -numeric -c video
[sudo] password for leigh: 
  *-display                 
       description: VGA compatible controller
       product: Haswell-ULT Integrated Graphics Controller [8086:A16]
       vendor: Intel Corporation [8086]
       physical id: 2
       bus info: pci@0000:00:02.0
       version: 09
       width: 64 bits
       clock: 33MHz
       capabilities: msi pm vga_controller bus_master cap_list rom
       configuration: driver=i915 latency=0
       resources: irq:49 memory:f7400000-f77fffff memory:d0000000-dfffffff ioport:f000(size=64) memory:c0000-dffff
  *-display
       description: 3D controller
       product: GK208M [GeForce GT 740M] [10DE:1292]
       vendor: NVIDIA Corporation [10DE]
       physical id: 0
       bus info: pci@0000:04:00.0
       version: a1
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress bus_master cap_list rom
       configuration: driver=nouveau latency=0
       resources: irq:50 memory:f6000000-f6ffffff memory:e0000000-efffffff memory:f0000000-f1ffffff ioport:d000(size=128) memory:f7000000-f707ffff
leigh@archlabs ~ % glmark2    
=======================================================
    glmark2 2021.12
=======================================================
    OpenGL Information
    GL_VENDOR:      Intel
    GL_RENDERER:    Mesa Intel(R) HD Graphics 4400 (HSW GT2)
    GL_VERSION:     4.6 (Compatibility Profile) Mesa 22.1.3
    Surface Config: buf=32 r=8 g=8 b=8 a=8 depth=24 stencil=0
    Surface Size:   800x600 windowed
=======================================================
[build] use-vbo=false: FPS: 567 FrameTime: 1.764 ms
[build] use-vbo=true: FPS: 557 FrameTime: 1.795 ms
[texture] texture-filter=nearest: FPS: 539 FrameTime: 1.855 ms
[texture] texture-filter=linear: FPS: 554 FrameTime: 1.805 ms
[texture] texture-filter=mipmap: FPS: 554 FrameTime: 1.805 ms
[shading] shading=gouraud: FPS: 533 FrameTime: 1.876 ms
[shading] shading=blinn-phong-inf: FPS: 532 FrameTime: 1.880 ms
[shading] shading=phong: FPS: 518 FrameTime: 1.931 ms
[shading] shading=cel: FPS: 518 FrameTime: 1.931 ms
[bump] bump-render=high-poly: FPS: 483 FrameTime: 2.070 ms
[bump] bump-render=normals: FPS: 541 FrameTime: 1.848 ms
[bump] bump-render=height: FPS: 539 FrameTime: 1.855 ms
[effect2d] kernel=0,1,0;1,-4,1;0,1,0;: FPS: 496 FrameTime: 2.016 ms
[effect2d] kernel=1,1,1,1,1;1,1,1,1,1;1,1,1,1,1;: FPS: 384 FrameTime: 2.604 ms
[pulsar] light=false:quads=5:texture=false: FPS: 521 FrameTime: 1.919 ms
[desktop] blur-radius=5:effect=blur:passes=1:separable=true:windows=4: FPS: 348 FrameTime: 2.874 ms
[desktop] effect=shadow:windows=4: FPS: 440 FrameTime: 2.273 ms
[buffer] columns=200:interleave=false:update-dispersion=0.9:update-fraction=0.5:update-method=map: FPS: 362 FrameTime: 2.762 ms
[buffer] columns=200:interleave=false:update-dispersion=0.9:update-fraction=0.5:update-method=subdata: FPS: 389 FrameTime: 2.571 ms
[buffer] columns=200:interleave=true:update-dispersion=0.9:update-fraction=0.5:update-method=map: FPS: 401 FrameTime: 2.494 ms
[ideas] speed=duration: FPS: 542 FrameTime: 1.845 ms
[jellyfish] <default>: FPS: 459 FrameTime: 2.179 ms
[terrain] <default>: FPS: 131 FrameTime: 7.634 ms
[shadow] <default>: FPS: 387 FrameTime: 2.584 ms
[refract] <default>: FPS: 168 FrameTime: 5.952 ms
[conditionals] fragment-steps=0:vertex-steps=0: FPS: 548 FrameTime: 1.825 ms
[conditionals] fragment-steps=5:vertex-steps=0: FPS: 540 FrameTime: 1.852 ms
[conditionals] fragment-steps=0:vertex-steps=5: FPS: 532 FrameTime: 1.880 ms
[function] fragment-complexity=low:fragment-steps=5: FPS: 535 FrameTime: 1.869 ms
[function] fragment-complexity=medium:fragment-steps=5: FPS: 530 FrameTime: 1.887 ms
[loop] fragment-loop=false:fragment-steps=5:vertex-steps=5: FPS: 527 FrameTime: 1.898 ms
[loop] fragment-steps=5:fragment-uniform=false:vertex-steps=5: FPS: 430 FrameTime: 2.326 ms
[loop] fragment-steps=5:fragment-uniform=true:vertex-steps=5: FPS: 544 FrameTime: 1.838 ms
=======================================================
                                  glmark2 Score: 474 
=======================================================
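The `xrandr --listproviders` output above encodes what each GPU can do as a `cap` bitmask. As a sketch (the bit values come from the RandR 1.4 protocol; the `decode_caps` helper name is just for illustration), it can be decoded like this:

```shell
# Decode the "cap" bitmask shown by `xrandr --listproviders`.
# RandR 1.4 provider capability bits:
#   0x1 = Source Output, 0x2 = Sink Output,
#   0x4 = Source Offload, 0x8 = Sink Offload
decode_caps() {
    cap=$(( $1 ))
    out=""
    [ $(( cap & 1 )) -ne 0 ] && out="$out Source-Output"
    [ $(( cap & 2 )) -ne 0 ] && out="$out Sink-Output"
    [ $(( cap & 4 )) -ne 0 ] && out="$out Source-Offload"
    [ $(( cap & 8 )) -ne 0 ] && out="$out Sink-Offload"
    echo "${out# }"
}

decode_caps 0xb   # Intel:   Source-Output Sink-Output Sink-Offload
decode_caps 0x5   # nouveau: Source-Output Source-Offload
```

So the Intel provider (0xb) drives the displays and can *accept* offloaded rendering, while nouveau (0x5) has no outputs of its own but can *offload* rendering to another provider - exactly the PRIME setup that `DRI_PRIME=1` uses.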

Managed to run that test on the Nvidia (using this):

leigh@archlabs ~ % DRI_PRIME=1 glmark2
=======================================================
    glmark2 2021.12
=======================================================
    OpenGL Information
    GL_VENDOR:      nouveau
    GL_RENDERER:    NV108
    GL_VERSION:     4.3 (Compatibility Profile) Mesa 22.1.3
    Surface Config: buf=32 r=8 g=8 b=8 a=8 depth=24 stencil=0
    Surface Size:   800x600 windowed
=======================================================
[build] use-vbo=false: FPS: 265 FrameTime: 3.774 ms
[build] use-vbo=true: FPS: 310 FrameTime: 3.226 ms
[texture] texture-filter=nearest: FPS: 289 FrameTime: 3.460 ms
[texture] texture-filter=linear: FPS: 288 FrameTime: 3.472 ms
[texture] texture-filter=mipmap: FPS: 295 FrameTime: 3.390 ms
[shading] shading=gouraud: FPS: 288 FrameTime: 3.472 ms
[shading] shading=blinn-phong-inf: FPS: 288 FrameTime: 3.472 ms
[shading] shading=phong: FPS: 288 FrameTime: 3.472 ms
[shading] shading=cel: FPS: 287 FrameTime: 3.484 ms
[bump] bump-render=high-poly: FPS: 266 FrameTime: 3.759 ms
[bump] bump-render=normals: FPS: 308 FrameTime: 3.247 ms
[bump] bump-render=height: FPS: 305 FrameTime: 3.279 ms
[effect2d] kernel=0,1,0;1,-4,1;0,1,0;: FPS: 261 FrameTime: 3.831 ms
[effect2d] kernel=1,1,1,1,1;1,1,1,1,1;1,1,1,1,1;: FPS: 216 FrameTime: 4.630 ms
[pulsar] light=false:quads=5:texture=false: FPS: 274 FrameTime: 3.650 ms
[desktop] blur-radius=5:effect=blur:passes=1:separable=true:windows=4: FPS: 175 FrameTime: 5.714 ms
[desktop] effect=shadow:windows=4: FPS: 201 FrameTime: 4.975 ms
[buffer] columns=200:interleave=false:update-dispersion=0.9:update-fraction=0.5:update-method=map: FPS: 203 FrameTime: 4.926 ms
[buffer] columns=200:interleave=false:update-dispersion=0.9:update-fraction=0.5:update-method=subdata: FPS: 232 FrameTime: 4.310 ms
[buffer] columns=200:interleave=true:update-dispersion=0.9:update-fraction=0.5:update-method=map: FPS: 203 FrameTime: 4.926 ms
[ideas] speed=duration: FPS: 294 FrameTime: 3.401 ms
[jellyfish] <default>: FPS: 252 FrameTime: 3.968 ms
[terrain] <default>: FPS: 62 FrameTime: 16.129 ms
[shadow] <default>: FPS: 263 FrameTime: 3.802 ms
[refract] <default>: FPS: 79 FrameTime: 12.658 ms
[conditionals] fragment-steps=0:vertex-steps=0: FPS: 293 FrameTime: 3.413 ms
[conditionals] fragment-steps=5:vertex-steps=0: FPS: 293 FrameTime: 3.413 ms
[conditionals] fragment-steps=0:vertex-steps=5: FPS: 292 FrameTime: 3.425 ms
[function] fragment-complexity=low:fragment-steps=5: FPS: 293 FrameTime: 3.413 ms
[function] fragment-complexity=medium:fragment-steps=5: FPS: 291 FrameTime: 3.436 ms
[loop] fragment-loop=false:fragment-steps=5:vertex-steps=5: FPS: 293 FrameTime: 3.413 ms
[loop] fragment-steps=5:fragment-uniform=false:vertex-steps=5: FPS: 293 FrameTime: 3.413 ms
[loop] fragment-steps=5:fragment-uniform=true:vertex-steps=5: FPS: 293 FrameTime: 3.413 ms
=======================================================
                                  glmark2 Score: 258 
=======================================================

So does this mean my Intel is better?
That doesn’t make much sense to me.
Isn’t the Nvidia GPU included in the lappy as an addition, as a better video controller (or whatever it’s called)?

If I used the Nvidia, would it unload my CPU and/or memory?

How can I get the Nvidia to be used, and is that global, app by app, or something else?

Sorry for lots of quickfire Qs - I’m tired and don’t have the energy to be eloquent.
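On the per-app question: the same `DRI_PRIME=1` trick used for glmark2 can be wrapped in a tiny launcher script. A minimal sketch, assuming Mesa's PRIME offload with the open-source nouveau driver as on this system (the `/tmp/on-dgpu` path is just for illustration; with NVIDIA's proprietary driver the equivalent helper is `prime-run`):

```shell
# Create a one-line launcher that runs a single command on the discrete GPU,
# leaving everything else on the Intel chip.
cat > /tmp/on-dgpu <<'EOF'
#!/bin/sh
# Ask Mesa to render this one process on the secondary (NVIDIA) GPU
export DRI_PRIME=1
exec "$@"
EOF
chmod +x /tmp/on-dgpu

# Usage: /tmp/on-dgpu glmark2, /tmp/on-dgpu some-game, etc.
/tmp/on-dgpu echo "this process would render on the dGPU"
```

Making it global would instead mean exporting `DRI_PRIME=1` for the whole session (e.g. from a profile script), but per-app offload is the usual arrangement.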

Boot the MX Linux Workbench thumb drive and run Quick System Info.


will do, but on the 'morrow


I don’t do lappies much so I don’t know much.

Look in the BIOS for a setting to toggle which GPU is used. It will also affect Windows performance; the Optimus thing. If you don’t run on battery very often, set it to use the NVIDIA.
I believe the onboard GPU uses system memory, leaving less for everything else. The NVIDIA GPU probably has its own memory.
The NVIDIA has better performance, generally, if used with their drivers. YMMV.

Also: be careful of “If it ain’t broke, fix it till it breaks.”


My mantra,
but hopefully Timeshift will be my saviour.

Workbench is nice

Output below, but I can’t see where it says which GPU is used / master or whatever.

System:    Kernel: 5.10.0-13-amd64 x86_64 bits: 64 compiler: gcc v: 10.2.1 
           parameters: BOOT_IMAGE=/antiX/vmlinuz quiet splasht nosplash lang=en_US kbd=pt 
           tz=America/New_York 
           Desktop: Xfce 4.16.0 tk: Gtk 3.24.24 info: xfce4-panel wm: xfwm 4.16.1 vt: 7 
           dm: LightDM 1.26.0 Distro: MX-21.1_Workbench_x64 Wildflower April 16  2022 
           base: Debian GNU/Linux 11 (bullseye) 
Machine:   Type: Laptop System: ASUSTeK product: X550LB v: 1.0 serial: <filter> 
           Mobo: ASUSTeK model: X550LB v: 1.0 serial: <filter> UEFI: American Megatrends 
           v: X550LB.403 date: 06/26/2014 
Battery:   ID-1: BAT0 charge: 28.9 Wh (96.3%) condition: 30.0/44.2 Wh (67.8%) volts: 14.4 
           min: 14.4 model: ASUSTeK X550A30 type: Li-ion serial: N/A status: Charging 
           cycles: 432 
CPU:       Info: Dual Core model: Intel Core i5-4200U socket: rPGA988B bits: 64 type: MT MCP 
           arch: Haswell family: 6 model-id: 45 (69) stepping: 1 microcode: 10 cache: 
           L1: 128 KiB L2: 3 MiB L3: 3 MiB 
           flags: avx avx2 lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx bogomips: 18356 
           Speed: 1034 MHz min/max: 800/2600 MHz base/boost: 1600/3800 volts: 1.2 V 
           ext-clock: 100 MHz Core speeds (MHz): 1: 1034 2: 945 3: 964 4: 948 
           Vulnerabilities: Type: itlb_multihit status: KVM: VMX disabled 
           Type: l1tf mitigation: PTE Inversion; VMX: conditional cache flushes, SMT vulnerable 
           Type: mds 
           status: Vulnerable: Clear CPU buffers attempted, no microcode; SMT vulnerable 
           Type: meltdown mitigation: PTI 
           Type: spec_store_bypass status: Vulnerable 
           Type: spectre_v1 mitigation: usercopy/swapgs barriers and __user pointer sanitization 
           Type: spectre_v2 mitigation: Retpolines, STIBP: disabled, RSB filling 
           Type: srbds status: Vulnerable: No microcode 
           Type: tsx_async_abort status: Not affected 
Graphics:  Device-1: Intel Haswell-ULT Integrated Graphics vendor: ASUSTeK driver: i915 
           v: kernel bus-ID: 00:02.0 chip-ID: 8086:0a16 class-ID: 0300 
           Device-2: NVIDIA GK208M [GeForce GT 740M] vendor: ASUSTeK driver: N/A 
           alternate: nouveau bus-ID: 04:00.0 chip-ID: 10de:1292 class-ID: 0302 
           Device-3: Chicony USB2.0 HD UVC WebCam type: USB driver: uvcvideo bus-ID: 2-5:5 
           chip-ID: 04f2:b40a class-ID: 0e02 serial: <filter> 
           Display: x11 server: X.Org 1.20.11 compositor: xfwm4 v: 4.16.1 driver: 
           loaded: modesetting unloaded: fbdev,vesa display-ID: :0.0 screens: 1 
           Screen-1: 0 s-res: 3286x1080 s-dpi: 96 s-size: 869x285mm (34.2x11.2") 
           s-diag: 915mm (36") 
           Monitor-1: eDP-1 res: 1366x768 hz: 60 dpi: 101 size: 344x193mm (13.5x7.6") 
           diag: 394mm (15.5") 
           Monitor-2: HDMI-1 res: 1920x1080 hz: 60 dpi: 102 size: 477x268mm (18.8x10.6") 
           diag: 547mm (21.5") 
           OpenGL: renderer: Mesa DRI Intel HD Graphics 4400 (HSW GT2) v: 4.5 Mesa 20.3.5 
           compat-v: 3.0 direct render: Yes 
Audio:     Device-1: Intel Haswell-ULT HD Audio vendor: ASUSTeK driver: snd_hda_intel v: kernel 
           bus-ID: 00:03.0 chip-ID: 8086:0a0c class-ID: 0403 
           Device-2: Intel 8 Series HD Audio vendor: ASUSTeK driver: snd_hda_intel v: kernel 
           bus-ID: 00:1b.0 chip-ID: 8086:9c20 class-ID: 0403 
           Sound Server-1: ALSA v: k5.10.0-13-amd64 running: yes 
           Sound Server-2: PulseAudio v: 14.2 running: yes 
Network:   Device-1: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet vendor: ASUSTeK 
           driver: r8169 v: kernel port: e000 bus-ID: 02:00.1 chip-ID: 10ec:8168 class-ID: 0200 
           IF: eth0 state: up speed: 100 Mbps duplex: full mac: <filter> 
           Device-2: Qualcomm Atheros AR9485 Wireless Network Adapter vendor: Lite-On 
           driver: ath9k v: kernel modules: wl port: e000 bus-ID: 03:00.0 chip-ID: 168c:0032 
           class-ID: 0280 
           IF: wlan0 state: down mac: <filter> 
Drives:    Local Storage: total: 2.74 TiB used: 0 KiB (0.0%) 
           ID-1: /dev/sda maj-min: 8:0 vendor: Crucial model: CT1000MX500SSD1 
           family: Micron Client SSDs size: 931.51 GiB block-size: physical: 4096 B 
           logical: 512 B sata: 3.3 speed: 6.0 Gb/s type: SSD serial: <filter> rev: 033 
           temp: 31 C scheme: GPT 
           SMART: yes state: enabled health: PASSED on: 71d 11h cycles: 667 written: 9.95 TiB 
           ID-2: /dev/sdb maj-min: 8:16 type: USB vendor: Seagate model: Portable size: 1.82 TiB 
           block-size: physical: 4096 B logical: 512 B type: N/A serial: <filter> rev: 0712 
           scheme: GPT 
           SMART Message: A mandatory SMART command failed. Various possible causes. 
           ID-3: /dev/sdc maj-min: 8:32 type: USB vendor: SanDisk model: SDDR-B531 size: 7.4 GiB 
           block-size: physical: 512 B logical: 512 B type: N/A serial: <filter> rev: 2920 
           scheme: MBR 
           SMART Message: Unknown USB bridge. Flash drive/Unsupported enclosure? 
Swap:      Kernel: swappiness: 15 (default 60) cache-pressure: 100 (default) 
           ID-1: swap-1 type: partition size: 3.91 GiB used: 0 KiB (0.0%) priority: -2 
           dev: /dev/sda8 maj-min: 8:8 
Sensors:   System Temperatures: cpu: 53.0 C mobo: N/A 
           Fan Speeds (RPM): cpu: 2700 
Repos:     Packages: note: see --pkg apt: 1925 lib: 898 flatpak: 0 
           No active apt repos in: /etc/apt/sources.list 
           Active apt repos in: /etc/apt/sources.list.d/debian-stable-updates.list 
           1: deb http://deb.debian.org/debian bullseye-updates main contrib non-free
           Active apt repos in: /etc/apt/sources.list.d/debian.list 
           1: deb http://deb.debian.org/debian bullseye main contrib non-free
           2: deb http://security.debian.org/debian-security bullseye-security main contrib non-free
           Active apt repos in: /etc/apt/sources.list.d/mx.list 
           1: deb http://mirrors.rit.edu/mxlinux/mx-packages/mx/repo/ bullseye main non-free
Info:      Processes: 207 Uptime: 0m wakeups: 2 Memory: 11.15 GiB used: 985.4 MiB (8.6%) 
           Init: SysVinit v: 2.96 runlevel: 5 default: 5 tool: systemctl Compilers: gcc: 10.2.1 
           alt: 10 Shell: Zsh v: 5.8 running-in: quick-system-info-mx 
           inxi: 3.3.06 
Boot Mode: UEFI
Video Tweaks:
Detected possible Hybrid Graphics

No GPU toggle in the BIOS; the only graphics setting is ‘DVMT Pre-allocation’, which I increased from 64M to 512M.
I guess this controls how much system memory is allocated to the onboard GPU, and since I have 12 GB of memory that is never fully used, I whacked it up.

If you don’t have the proprietary Nvidia drivers for your card then yes, the Intel GPU will beat it hands down. Nouveau is shit, generally speaking; it runs, but that’s about all. Can’t really blame them either - Nvidia just wouldn’t help, so they had to reverse-engineer everything.


Thanks @natemaia
That is really good to know.
I don’t think I need the Nvidia as I don’t game, but…
Is installing the Nvidia drivers technically a bad idea (as in, aside from any moral point of view)?