Believe me, I was very hesitant to blame this on the driver. I know that
what I’m doing is very new to me so I naturally assumed what was happening
was my fault.
Here’s what’s happening:
I’m drawing into an off-screen context and blitting the image to a PtRaw
widget. When I fill up the screen I blit the off-screen context left by
200 pixels. That works fine. I then refill the rightmost 200 pixels
with stripes of two alternating colors. Here is that code:
void colorize_horizontal_data_channels( int left, int right )
{
    static const PgColor_t channel_colors[ 2 ] =
        { PgRGB( 215, 255, 255 ), PgRGB( 255, 215, 215 ) };
    PhRect_t sample_rect;
    bool color_count = false;

    sample_rect.ul.x = left;
    sample_rect.lr.x = right;
    sample_rect.ul.y = 0;

    for( int nc = 0; nc < NUM_SC_CHANNELS; nc++ )
    {
        // determine which data channels need to be displayed
        if( format.formats[ nc ].ch != NOT_USED
            && format.formats[ nc ].osd_pixels )
        {
            // fill this channel's horizontal band with the current stripe color
            PgSetFillColor( channel_colors[ color_count ] );
            sample_rect.lr.y = sample_rect.ul.y + format.formats[ nc ].osd_pixels - 1;
            PgDrawRect( &sample_rect, Pg_DRAW_FILL );

            // start the next band directly below this one, alternating colors
            sample_rect.ul.y = sample_rect.lr.y + 1;
            color_count = !color_count;
        }
    }
}
This same code is called at startup and works fine. But when it’s called
to replace just the rightmost 200 pixels, it’s as if it isn’t even
happening. I have verified that it IS being called.
What finally convinced me that it was a driver problem and not my code is
this: I copied the whole application onto my laptop to take it home to
work on, and when I got home it worked the very first time, without any
modification.
Here are the details:
I am running the card at 1280x1024 with 32-bit color.
My PtRaw widget takes up most of the screen (1263x929),
just in case that matters.
Desktop at work: 2.4 GHz w/1GB RAM
Graphics card:
Class = Display (VGA)
Vendor ID = 1002h, ATI Technologies
Device ID = 4750h, Rage 3D Pro PCI Graphics Accelerator
PCI index = 0h
Class Codes = 030000h
Revision ID = 5ch
Bus number = 2
Device number = 7
Function num = 0
Status Reg = 280h
Command Reg = 86h
Header type = 0h Single-function
BIST = 0h Build-in-self-test not supported
Latency Timer = 40h
Cache Line Size= 10h un-cacheable
PCI Mem Address = f8000000h prefetchable 32bit length 16777216 enabled
PCI IO Address = e800h length 256 disabled
PCI Mem Address = ff8fe000h 32bit length 4096 enabled
Subsystem Vendor ID = 1002h
Subsystem ID = 80h
PCI Expansion ROM = 80000000h length 131072 disabled
Max Lat = 0ns
Min Gnt = 8ns
PCI Int Pin = INT A
Interrupt line = 11
CPU Interrupt = bh
pidin -P io-graphics mem says:
479264 1 on/bin/io-graphics 12r RECEIVE 88K 4676K 12K(516K)*
479264 2 on/bin/io-graphics 12r REPLY 88K 4676K 4096(516K)
479264 3 on/bin/io-graphics 12r REPLY 88K 4676K 8192(132K)
ldqnx.so.2 @b0300000 344K 16K
libgri.so.2 @b8200000 40K 4096
libfont.so.1 @b820b000 40K 4096
libm.so.2 @b8216000 104K 8192
libdisputil.so.2 @b8232000 32K 8192
phfont.so @b823c000 124K 20K
tharnessutils.so.1 @b8260000 16K 4096
FCcore.so @b8265000 32K 8192
libblkcache.so.2 @b826f000 12K 4096
libFF-T2K-fm.so.1 @b8273000 12K 4096
bFF-T2K-cache.so.2 @b8277000 8192 4096
libFF-T2K.so.2 @b827a000 232K 12K
PHFcore.so @b82b7000 24K 4096
libfontutils.so.1 @b82be000 4096 4096
ttfFFcore.so @b82c0000 36K 4096
devg-rage.so @b82ca000 28K 4096
libffb.so.2 @b82d2000 44K 4096
gri-photon.so @b82de000 56K 8192
libphrender.so.2 @b82ee000 204K 8192
/dev/mem @40100000 ( 0) 4096
/dev/mem @40101000 ( 10000) 128K
/dev/mem @40121000 ( 0) 8192K
/dev/mem @40921000 (f87ff000) 4096
/dev/mem @40922000 ( 0) 32K
/dev/mem @4092a000 ( 0) 3840K
My laptop system is using an ATI Technologies (Vendor ID 1002)
Mobility M3 AGP 2x (Device ID 4c46) with
driver devg-vesabios.so (the new and improved version that what’s-his-name
did), and it is running in 1024x768x32-bit mode.
So, why can’t I reinitialize those rightmost 200 pixels, but only on the
desktop system?
Is there a workaround?
If it will help, I can tarball up the whole application with a data file
and send it along. Fair warning: it IS quite large.