I've had a ThinkPad X240 (Haswell ULV CPU, i7-4600U) for about three years. Until recently it ran Debian Jessie, and even then the graphics performance struck me as rather weak, but after the upgrade to Stretch it seems even slower.
The GPU is an Intel HD Graphics 4400.
Code:
$ uname -a
Linux olymp 4.9.0-4-amd64 #1 SMP Debian 4.9.51-1 (2017-09-28) x86_64 GNU/Linux
Code:
$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation Haswell-ULT Integrated Graphics Controller (rev 0b)
Code:
$ sudo lspci -vk|grep -A 10 VGA
[sudo] password for bnc:
00:02.0 VGA compatible controller: Intel Corporation Haswell-ULT Integrated Graphics Controller (rev 0b) (prog-if 00 [VGA controller])
Subsystem: Lenovo ThinkPad X240
Flags: bus master, fast devsel, latency 0, IRQ 44
Memory at f0000000 (64-bit, non-prefetchable) [size=4M]
Memory at e0000000 (64-bit, prefetchable) [size=256M]
I/O ports at 3000 [size=64]
[virtual] Expansion ROM at 000c0000 [disabled] [size=128K]
Capabilities: [90] MSI: Enable+ Count=1/1 Maskable- 64bit-
Capabilities: [d0] Power Management version 2
Capabilities: [a4] PCI Advanced Features
Kernel driver in use: i915
Code:
$ glxgears
9628 frames in 5.0 seconds = 1925.571 FPS
9217 frames in 5.0 seconds = 1843.379 FPS
Code:
$ glxinfo | grep render
direct rendering: Yes
GLX_MESA_multithread_makecurrent, GLX_MESA_query_renderer,
GLX_MESA_multithread_makecurrent, GLX_MESA_query_renderer,
Extended renderer info (GLX_MESA_query_renderer):
OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.9, 256 bits)
GL_ARB_conditional_render_inverted, GL_ARB_conservative_depth,
GL_NV_conditional_render, GL_NV_depth_clamp, GL_NV_packed_depth_stencil,
GL_ARB_conditional_render_inverted, GL_ARB_conservative_depth,
GL_NV_conditional_render, GL_NV_depth_clamp, GL_NV_fog_distance,
GL_OES_element_index_uint, GL_OES_fbo_render_mipmap,
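The renderer string above is what worries me: "llvmpipe" is Mesa's CPU software rasterizer, not the Intel hardware driver. As a sketch of the check I'm doing by eye (the helper name `is_software_renderer` is made up for illustration; `llvmpipe`, `softpipe` and `swrast` are the Mesa software renderers I know of):

```shell
# Classify a GL renderer string as software or hardware rendering.
# (is_software_renderer is a hypothetical helper, not a real tool.)
is_software_renderer() {
  case "$1" in
    *llvmpipe*|*softpipe*|*swrast*) return 0 ;;  # Mesa software rasterizers
    *)                              return 1 ;;  # anything else: presumably hardware
  esac
}

# The string reported by glxinfo above:
renderer="Gallium 0.4 on llvmpipe (LLVM 3.9, 256 bits)"
if is_software_renderer "$renderer"; then
  echo "software rendering"
else
  echo "hardware rendering"
fi
```

On this machine it prints "software rendering", which matches the sluggish feel despite the decent glxgears numbers.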
Code:
$ sudo dmesg | grep drm
[ 1.084477] [drm] Initialized
[ 1.113929] [drm] Memory usable by graphics device = 2048M
[ 1.113931] [drm] Replacing VGA console driver
[ 1.120450] [drm] Supports vblank timestamp caching Rev 2 (21.10.2013).
[ 1.120450] [drm] Driver supports precise vblank timestamp query.
[ 1.330872] [drm] Initialized i915 1.6.0 20160919 for 0000:00:02.0 on minor 0
[ 1.517545] fbcon: inteldrmfb (fb0) is primary device
[ 2.624861] i915 0000:00:02.0: fb0: inteldrmfb frame buffer device
Code:
...loading libGL.so.1:
Calling SDL_Init(SDL_INIT_VIDEO)...
SDL_Init(SDL_INIT_VIDEO) passed.
Initializing OpenGL display
...setting mode 13: 1366 768
Using 8/8/8 Color bits, 24 depth, 0 stencil display.
GL_RENDERER: Gallium 0.4 on llvmpipe (LLVM 3.9, 256 bits)
Initializing OpenGL extensions
...ignoring GL_S3_s3tc
...ignoring GL_EXT_texture_env_add
...using GL_ARB_multitexture
...using GL_EXT_compiled_vertex_array
...GL_EXT_texture_filter_anisotropic not found
GL_VENDOR: VMware, Inc.
GL_RENDERER: Gallium 0.4 on llvmpipe (LLVM 3.9, 256 bits)
GL_VERSION: 3.0 Mesa 13.0.6
GL_MAX_TEXTURE_SIZE: 8192
GL_MAX_ACTIVE_TEXTURES_ARB: 8
PIXELFORMAT: color(24-bits) Z(24-bit) stencil(0-bits)
MODE: 13, 1366 x 768 fullscreen hz:N/A
Among other things I have these lines in /var/log/Xorg.0.log. The line with the (EE) doesn't look good, does it? Is that the cause?
Code:
[ 4.545] (WW) modeset(0): Option "SwapbuffersWait" is not used
[ 4.545] (WW) modeset(0): Option "TearFree" is not used
[ 4.545] (WW) modeset(0): Option "DRI" is not used
[ 4.545] (--) RandR disabled
[ 4.547] (II) SELinux: Disabled on system
[ 4.547] (II) AIGLX: Screen 0 is not DRI2 capable
[ 4.547] (EE) AIGLX: reverting to software rendering
[ 4.668] (II) IGLX: enabled GLX_MESA_copy_sub_buffer
[ 4.668] (II) IGLX: Loaded and initialized swrast
[ 4.668] (II) GLX: Initialized DRISWRAST GL provider for screen 0
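The three (WW) lines make me suspect there is a leftover config file still setting options for the old intel driver, which the modesetting driver now ignores. One thing I considered trying (just a guess on my part, not something the logs confirm; file name and options are my assumption) is configuring modesetting explicitly, e.g. in /etc/X11/xorg.conf.d/20-gpu.conf:

```
Section "Device"
    Identifier "Intel Graphics"
    Driver     "modesetting"
    Option     "AccelMethod" "glamor"   # hardware acceleration via OpenGL
    Option     "DRI" "3"
EndSection
```

But since the kernel side (i915) looks fine in dmesg, I'm not sure whether the problem is here or in Mesa.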
Do I need to configure something somewhere?
Or is the wrong driver being used?
Best regards
Christian