devi-elo input driver

I’m new to programming. I really need some help regarding this.

I have been trying to generate a touch event to the devi-elo resource manager without anyone touching the touch screen.

I started my touch screen driver as below:

devi-elo -vvvvvvvv smartset -R fd -d/dev/ser1

When I touched the bottom-left of the screen, I was able to get its coordinates as below:

Received : 0x55
Received : 0x54
Received : 0x2
Received : 0xf6
Received : 0x1
Received : 0x64
Received : 0x2
Received : 0xff
Received : 0
Received : 0xb1
X=5 Y=755 Z=0
Emitting buttons=00000004

In my code, I just:
-put those 10 bytes in a buffer,
-opened the serial port /dev/ser1 as write-only,
-didn't set any termios structure, since the port settings had already been set by devi-elo,
-wrote the buffer data into /dev/ser1 (roughly as in the sketch below),
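
The sketch, with the ten bytes recorded above (error handling mostly omitted):

/* Replay the recorded ELO bytes into the port the driver reads.
 * Assumes devi-elo has already configured the port, so no termios
 * setup is done here. */
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    unsigned char buf[10] = {
        0x55, 0x54, 0x02, 0xf6, 0x01,
        0x64, 0x02, 0xff, 0x00, 0xb1
    };
    int fd = open("/dev/ser1", O_WRONLY);

    if (fd == -1)
        return 1;
    write(fd, buf, sizeof buf);
    close(fd);
    return 0;
}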

for which the driver displayed:

Received : 0x55
Received : 0x41
SmartSet: unknown CMD BYTE (0x41)
Received : 0x30

So you talked to the ELO and it talked back, but the driver didn't like what it said. That is no surprise: you sent the ELO the same bytes it sends to you, i.e. its responses rather than its commands. Do you know how to talk to the ELO? And what do you mean by "trying to generate touch events to the devi-elo resource manager without anyone touching the touch screen"? It might be easier to find out about Photon events and how to emit them.

I'm trying to automate the testing of a system. Say I have a set of events: first touch this button at this coordinate, then another button at another coordinate, and so on. The first time I do this I'll just capture the coordinates of these events, and the next time I want to do the same things I'll just run my code and it will do the required steps for me. I also want it to look like the events have been generated by the driver itself.

About "Photon events and emitting them": will I be able to see the button widget getting pressed (the animation) and then, as a result, the callback function being invoked? I don't think that's possible; the only thing we could do is call the callback function directly.

Ok, now what you are trying to do is clear. So you record what the ELO sent to the driver. Now you must disconnect the ELO and send the data to the driver from the serial port. You won't do that by opening up the ELO input serial port and writing to it unless there is a loopback connector. Otherwise you need another serial port with a cable connected between them.

Ok… now I get it. So to do this I'd have to remove the ELO and place a loopback connector. But I won't be able to do that, because I'd have to remove the serial port: in this system the VGA cable and serial cable are coupled into a single cable. So instead I'll have to see how the abs module communicates the touches to Photon.

You might be able to stop the ELO driver, then start it again looking at another serial port.

The Photon idea starts to sound better and better. It is technically complex, but much more elegant. You would first put up a region that detects, but does not stop, Photon events from the ELO driver. Then you touch the screen and record the event as it passes by. Later, when you are testing, you emit the same event. A sketch of the capture side follows below.
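
For example, a rough capture-side sketch. It is hedged: attaching to the default Photon server, the full-screen rectangle, and sensing only Ph_EV_BUT_PRESS are assumptions, not something from this thread.

/* Open a full-screen region that senses button presses without being
 * opaque to them, then print the position of each press. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <photon/PhT.h>

#define EVBUF (sizeof(PhEvent_t) + 1024)

int main(void)
{
    PhRegion_t region;
    PhRect_t rect = { {0, 0}, {1023, 767} };   /* whole 1024x768 screen */
    PhEvent_t *event = malloc(EVBUF);

    if (PhAttach(NULL, NULL) == NULL || event == NULL)
        return EXIT_FAILURE;

    memset(&region, 0, sizeof region);
    region.events_sense = Ph_EV_BUT_PRESS;     /* sense, don't consume */
    region.flags = Ph_FORCE_FRONT;

    if (PhRegionOpen(Ph_REGION_EV_SENSE | Ph_REGION_FLAGS | Ph_REGION_RECT,
                     &region, &rect, NULL) == -1)
        return EXIT_FAILURE;

    for (;;) {
        if (PhEventNext(event, EVBUF) == Ph_EVENT_MSG
            && event->type == Ph_EV_BUT_PRESS) {
            PhPointerEvent_t *pe = PhGetData(event);
            printf("press at %d,%d\n", pe->pos.x, pe->pos.y);
        }
    }
}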

The other problem would be that all serial ports in the system are used for their own specific purposes. I won't be able to add a serial port just for the purpose of testing the system…
Yes, Masch… the Photon idea must be the best for this case.

I left the driver part aside and was looking into the Photon interface code they've used, and the API:

int PhEventEmit( PhEvent_t const *event, PhRect_t const *rects, void const *data );

That is how you emit an event. You must emit an event from a region, which you must first create. You also need to find out about the format of the event you want to emit. I believe these will be "mouse-click" events. Like I said, a little complex, at least the first time.

Finding it difficult, not sure where to start. Need help, buddy. I haven't worked with Photon: how do I create these regions and so on? Masch, can you help me get started?

This is the part of the driver code which handles the emission to Photon:

/* trig_ptr
 *
 * Emit a raw pointer event into Photon event space.
 */

void
trig_ptr(PhRawPtrEvent_t *ptr_ev, unsigned group, PhRid_t rid)
{
    PhEvent_t ev;
    PhRect_t rect;

    if (rid == 0) {
        return;
    }

    memset(&ev, 0x00, sizeof ev);
    memset(&rect, 0x00, sizeof rect);

    ev.type = Ph_EV_RAW;
    ev.subtype = Ph_EV_RAW_PTR;

    ev.data_len = offsetof(PhRawPtrEvent_t, coord) +
        ptr_ev->num_coord * sizeof(PhRawPtrCoord_t);

    ev.num_rects = 1;
    ev.input_group = group;
    ev.emitter.rid = rid;

    rect.ul.x = rect.ul.y = 0;
    rect.lr.x = rect.lr.y = 0;

    if (verbosity >= 4) {
        printf("X=%d Y=%d Z=%d\n",
               ((PhRawPtrCoord_t *)((char *)ptr_ev + offsetof(PhRawPtrEvent_t, coord)))->x,
               ((PhRawPtrCoord_t *)((char *)ptr_ev + offsetof(PhRawPtrEvent_t, coord)))->y,
               ((PhRawPtrCoord_t *)((char *)ptr_ev + offsetof(PhRawPtrEvent_t, coord)))->z);

        printf("Emitting buttons=%08X\n", ptr_ev->button_state);
    }

    PhEventEmit(&ev, &rect, ptr_ev);
    ptr_ev->num_coord = 0;
}

What are these structures, PhRawPtrEvent_t and PhRawPtrCoord_t, and what does offsetof(PhRawPtrEvent_t, coord) compute?
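
(For context: offsetof() yields the byte offset of a member within a struct. The driver uses it to size the variable-length event, whose last member is an array of coordinates. A generic, self-contained illustration, with hypothetical struct names mimicking the layout the driver code implies:)

#include <stdio.h>
#include <stddef.h>

typedef struct { short x, y, z; unsigned short dmsec; } coord_t;

typedef struct {
    unsigned long msec;
    unsigned long button_state;
    unsigned char num_coord;
    coord_t coord[1];          /* variable-length tail */
} raw_ptr_event_t;

int main(void)
{
    /* Size of the fixed header plus one coordinate, exactly as
     * trig_ptr() computes data_len above. */
    size_t len = offsetof(raw_ptr_event_t, coord) + 1 * sizeof(coord_t);
    printf("data_len for one coordinate: %zu\n", len);
    return 0;
}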

Just to be on the safe side:

What are you trying to test? You won't be testing the ELO itself with your app, I guess.
Do you want to test the PC/computer/whatever the ELO is connected to?

If you want to check that your software handles the events correctly, what you are trying now is fine.
If you want to look at the serial port, the ELO, or the hardware itself, you won't get anywhere by emitting events.
I just want to be sure you are not going to code a loopback now, like testing ELO touch devices by sending Photon events to your software running on the system. That won't do any good ^^. Neither would it show anything hardware-related; the only thing you could test with this is the implementation and design of your UI and its corresponding callbacks and functions.

Another option would be to simulate variable touch events, i.e. events anywhere on the screen, which would be a kind of virtual mouse/touch for testing how your application reacts. That is what could be done via event emitting, isn't it?

Ok, this is what I want to do…
We have the LAUNCH button on the QNX desktop. If you touch it on the touch screen, it gets activated. Internally the Photon server uses PhEventEmit() to do this. Is it possible to write code that simulates a touch event (with no one touching the screen) over the Launch button, so that it gets activated?

Yes, it is possible. Once you understand how the Photon kernel works, this is anything but magic. The mouse driver you are running does just this: it emits events from a region at some x,y coordinates. The events travel through regions; if a region is sensitive to that event, it will get a QNX message about the event. A region can be transparent or opaque to an event; if it is opaque, the event stops there. So what you need to create is a very simplified version of the mouse driver that emits the right event at the right location. Whatever program handles the "Launch" will not be able to tell what the origin of the event is, and so it will have to act the same way.

There is a nice introduction to how the Photon kernel works in the documentation and you should probably check it out before proceeding.
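
For orientation, a minimal emitter sketch modeled on the driver's trig_ptr() shown earlier. It is hedged: the region setup and the 0x0004 button mask (taken from the driver log above) are assumptions, not a confirmed recipe.

/* Emit one raw pointer "press" at a fixed coordinate, the way the
 * driver does. Sketch only. */
#include <stdlib.h>
#include <string.h>
#include <stddef.h>
#include <photon/PhT.h>

int main(void)
{
    PhRegion_t region;
    PhRect_t rect = { {0, 0}, {0, 0} };
    PhEvent_t ev;
    PhRawPtrEvent_t raw;
    PhRid_t rid;

    if (PhAttach(NULL, NULL) == NULL)
        return EXIT_FAILURE;

    memset(&region, 0, sizeof region);
    region.flags = Ph_FORCE_FRONT | Ph_PTR_REGION;
    region.input_group = 1;
    rid = PhRegionOpen(Ph_REGION_FLAGS | Ph_REGION_INPUT_GROUP | Ph_REGION_RECT,
                       &region, &rect, NULL);
    if (rid == -1)
        return EXIT_FAILURE;

    memset(&raw, 0, sizeof raw);
    raw.num_coord = 1;
    raw.coord[0].x = 25;            /* captured coordinates from above */
    raw.coord[0].y = 755;
    raw.button_state = 0x0004;      /* button mask seen in the driver log */

    memset(&ev, 0, sizeof ev);
    ev.type = Ph_EV_RAW;
    ev.subtype = Ph_EV_RAW_PTR;
    ev.num_rects = 1;
    ev.emitter.rid = rid;
    ev.input_group = 1;
    ev.data_len = offsetof(PhRawPtrEvent_t, coord) + sizeof(PhRawPtrCoord_t);

    return PhEmit(&ev, &rect, &raw) == -1 ? EXIT_FAILURE : EXIT_SUCCESS;
}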

That will actually move the cursor; a PtHit() could then return the id of the widget that needs to get the event ^^
I remember doing this as a workaround in the 4.25 days: moving one pixel right and down, moving back up, and hit…
I wonder what that was for…

PhMoveCursorAbs()
PhMoveCursorRel()
PtHit()

But since we already know which button has to get the event, we could just do it, couldn't we?

Moving the cursor might be nice for the aesthetics, but it is not necessary to trigger the button.
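
(For reference, a small sketch of the cursor-move idea; the input group value and the coordinates, taken from earlier in the thread, are assumptions:)

#include <stdlib.h>
#include <photon/PhT.h>

int main(void)
{
    if (PhAttach(NULL, NULL) == NULL)   /* default Photon server */
        return EXIT_FAILURE;
    PhMoveCursorAbs(1, 25, 755);        /* input group 1; captured x,y */
    return EXIT_SUCCESS;
}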

It should be done in the same way as a touch event.

I created a region using PhRegionOpen() and then used the PhEmit() API to simulate the touch event, passing the captured coordinates, which intersect with the Launch button. This time the mouse pointer moved on top of the Launch button, but the button still wasn't activated.

PhRid_t PRid;
PhRegion_t PRegion;
PhRect_t PRect;
char *photon_group;

memset(&PRegion, 0, sizeof(PRegion));
memset(&PRect, 0x0, sizeof(PRect));

//Filling up of PhRegion_t structure.
PRegion.events_sense = Ph_EV_SYSTEM | Ph_EV_SERVICE;
PRegion.flags = Ph_FORCE_FRONT | Ph_PTR_REGION;
PRegion.input_group = 1;

//Filling up of PhRect_t structure.
PRect.ul.x = 0;
PRect.ul.y = 0;
PRect.lr.x = 1023;
PRect.lr.y = 767;

PRid = PhRegionOpen(Ph_REGION_EV_SENSE |
                    Ph_REGION_FLAGS |
                    Ph_REGION_RECT |
                    Ph_REGION_INPUT_GROUP,
                    &PRegion, &PRect, NULL);

And this is how I emitted the raw pointer event using PhEmit():

//Captured data for PhEmit()
PhEvent_t event;
PhRect_t rect;
PhRawPtrEvent_t PhotonRawPtrEvent;
PhRawPtrCoord_t *ptrCoord;

memset(&event, 0, sizeof(event));
PhotonRawPtrEvent.msec = 0;
PhotonRawPtrEvent.button_state = 0;
PhotonRawPtrEvent.flags = (char)0;
PhotonRawPtrEvent.raw_flags = (char)0;
PhotonRawPtrEvent.zero = 0;

//BOTTOM LEFT coords
PhotonRawPtrEvent.num_coord = 1;
ptrCoord = PhotonRawPtrEvent.coord;
ptrCoord->x = 25;
ptrCoord->y = 755;
ptrCoord->z = 0;
ptrCoord->dmsec = 0;

//PhEmit() part
event.type = Ph_EV_RAW;
event.subtype = Ph_EV_RAW_PTR;
event.num_rects = 1;
event.data_len = 20;
event.emitter.rid = PRid;
event.collector.rid = 0;
event.collector.rid = Ph_ROOT_RID;
rect.ul.x = rect.ul.y = 0;
rect.lr.x = rect.lr.y = 0;

PhEmit(&event, &rect, (void *)&PhotonRawPtrEvent);

Can anyone help me find what I'm missing? How can the event emitted in the region I've created be sent to the region below it?

Hi guys,

I know that this comes long after you had this conversation about ELO touchscreens, but I hope you are still active and could find some time to help me.

I am trying to establish basic touch screen function, where touching the screen would be equivalent to left-clicking the same point with the mouse. Currently I am not concentrating on return codes, so I have not developed any application; I am only typing commands in pterm.
I am working at 1024x768 resolution on an ET1515L-8CEC-1-GY-G model with serial communication. The operating system version is 6.3.0 SP3, although I believe nothing would change on 6.3.2.
I typed:

calib

Then I left-clicked four times on the target, and finally on the button where the text "Press to complete Calibration" was displayed.

This operation created the file /etc/system/trap/calib.Miki (Miki is the hostname of my computer).
Then I typed:

devi-elo smartset fd -d/dev/ser1

whereupon the following error was reported:
Error: found graphics region with no capabilities data
but apparently some touchscreen function was initiated.
The problem is that only a small bottom part of the screen has touchscreen function, the y-coordinate is upside down, the x-coordinate has wrong scaling (the cursor's x-coordinate is around four times larger than the x-coordinate of the touch point), and there is an x-coordinate offset (when the leftmost point on the screen is touched, the cursor moves right by more than 50 pixels).

Any thoughts?

Best regards,
Miki


I'm a little confused by your calibration procedure. I would expect the following instead:

  1. Start the driver
  2. Run the calibration program
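
Using the commands from this thread, that would be roughly:

devi-elo smartset fd -d/dev/ser1
calib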

You used the term "click" when describing how you used the calibration program. That sounds like you are using the mouse, which would not work at all. You need to press the targets on the touch screen itself.

The whole point of the calibration process is to use the driver and the hardware to read in data from where you press, and to find the coefficients that transform your presses into real places on the screen.

I hope this helps.