Hi everyone,
I'm working on an issue I'm having with Google Glass: when I connect a mouse I get a cursor, but clicking doesn't seem to do anything. I half expect that in a "Glass app", but I have a variety of legacy apps side-loaded that I know work with a mouse, yet they don't on Glass.
Through adb I can see that a new mouse device is created in /dev/input/, and when I run getevent I can see mouse events firing. These include mouse movements as well as button-down and button-up events. It looks like all the driver-level plumbing is set up correctly and the mouse itself is working fine.
What I believe is the issue is that there is a disconnect between the event and the event listeners in the actual apps. So what I am curious to know is: how do event listeners actually get triggered by a system event?
My guess is that the apps look to /system/lib/libinput.so to register mouse events, but the library loaded on Glass doesn't have methods for mouse input. If that's the case, could I get this functionality back by updating the library? And if so, how would I go about decompiling the library to add that functionality?
My understanding of APKs and Unix system events is very limited, and this is just my current hunch. If anyone who knows more could lend me some knowledge or point me in a better direction, I would appreciate it.
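For what it's worth, from what I've read a mouse left-click is normally delivered to an app as ordinary touch events (hover moves and the other buttons go through onGenericMotionEvent), so a bare View that just logs both callbacks should show what, if anything, the framework is actually delivering on Glass. A rough sketch of what I mean (the class name is mine, nothing Glass-specific):

Code:
// Minimal debugging View: log whatever input the framework actually delivers.
import android.content.Context;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

public class InputProbeView extends View {
    private static final String TAG = "InputProbe";

    public InputProbeView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Mouse left-clicks usually show up here as ACTION_DOWN/ACTION_UP with SOURCE_MOUSE.
        Log.d(TAG, "touch action=" + event.getActionMasked()
                + " source=" + event.getSource());
        return true;
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        // Hover movement and non-primary buttons usually arrive here.
        Log.d(TAG, "generic action=" + event.getActionMasked()
                + " buttons=" + event.getButtonState());
        return true;
    }
}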
Greetings,
Thank you for using XDA Assist.
Please create an XDA account and ask your question here:
Questions and Answers
You'll receive expert advice there. Good luck and welcome to XDA!
Related
I mean, is it possible to develop software for a PPC that shows a mouse cursor on the PPC's screen, so that we can use the navigation key to control the mouse?
That's quite easy, but would there be much point?
The program you're using would then lose use of the control pad.
An idea, though, would be, say:
Record button toggles mouse (shows/hides cursor)
Joystick/pad controls mouse cursor
Contacts button sends left click
Calendar button sends context menu click.
It could work. I might play with this sometime. :wink:
V
There is software for a virtual mouse
If you have a PPC with USB host function, you can connect your computer's mouse to the PPC over the USB connection. There is software available for the mouse pointer; here's the link:
http://www.deje.gmxhome.de/download.html
You could make it something special.
For example, in the software's settings we could choose which programme activates the mouse.
Re: There is software for a virtual mouse
dhiraj228 said:
if u have a ppc with usb host function then u can connect ur computer's mouse to the ppc using the usb connection there is a software available for the mouse pointer here's the link
http://www.deje.gmxhome.de/download.html
Yes, I know about this.
But a virtual mouse is not only a tool to control the PPC;
it can be very helpful for one-handed operation.
Sometimes we have only one free hand to use the device.
Hi Vijay, or other developers,
Any development/thoughts on the virtual mouse? I like the idea of non-stylus control of my Wizard, especially when it's in its case. You could assign a button to switch it on and off. You could call it VJmouse or VJrodent. I would pay for this - I know that's music to your ears.
Cheers
Generating mouse clicks is easy enough, but drawing the cursor in different locations would be the harder part. I have not seen cursors drawn in places other than the centre. How could that be done without drawing to the screen buffer?
One possible idea is a transparent window, say 16 by 16. Anyway, I just wanted to mention that the HP iPAQ 4700 has a native app in its ROM that shows a mouse cursor and lets the arrow-key area act as a touchpad, just like on a laptop.
Guys, this is by the guy that did the FileDialogChanger, a genius in his own time:
http://www.geocities.co.jp/SiliconValley-Cupertino/2039/FakeCursorPPC.zip
Please indicate if it does or doesn't work. I don't think it would be too difficult to write a new mouse emulator, but there's no point reinventing the mouse wheel :wink:
V
Thanks Vijay, that's wicked.
Finally, true one-handed operation of my PDA - es
So I won't have to pay you all my bucks after all!
I have tried this app and it is really good, but it has a couple of bugs. One is that it won't delete or backspace on the virtual keyboard; second, the call and end-call buttons and softkeys are immobilized while the fake cursor is operational. If it could be developed further, and possibly include other pointer styles, it would be an excellent app - I think it's marketable and something that should be part of the OS.
Smartphone users will be particularly interested; as a long-term SPV user, I certainly am.
Cheers
Wicked, man... I like it.
Kudos to the developer.
vijay555 said:
That's quite easy, but would there be much point?
The program you're using would then loose use of the control pad.
An idea tho would be, say:
Record button toggles mouse (shows/hides cursor)
Joystick/pad controls mouse cursor
Contacts button sends left click
Calendar button sends context menu click.
It could work. I might play with this sometime. :wink:
V
I'd be interested in a "mouse" or "Cursor control" app.
IMHO the one thing missing from the Universal is a "clit-mouse" (the tiny joystick-type thing you used to get on laptops). I don't think it would have been that hard to implement physically either.
vijay555 said:
Guys, this is by the guy that did the FileDialogChanger, a genius in his own time:
http://www.geocities.co.jp/SiliconValley-Cupertino/2039/FakeCursorPPC.zip
Please indicate if it does or doesn't work. I don't think it would be too difficult to write a new mouse emulator, but there's no point reinventing the mouse wheel :wink:
V
Can someone describe exactly what this does?
Does it slow the device down at all?
(And has anyone translated the txt file in that zip?)
Yes, it works quite well as a cursor/mouse you can control with the d-pad. No, it does not slow the device down, and no, I can't translate the txt, but it is quite self-explanatory once you use it. Give it a go, 'tis cool!
Find it attached for other users in case the link goes down.
I have added a mouse cursor; just place it in the same directory as FakeCursor.exe and you will have a typical mouse pointer.
meschle said:
Yes works qute will as a cursor / mouse you can control with d-pad. no does not slow device down and no can't translate txt but is quite self explanatory when you use. Give it a go tis cool!
Find attached for other users in case link goes down
I assume the .exe in the zip is for installing over ActiveSync? Or is it run on the PPC itself?
Run on the PPC itself - I stuck it in \Program Files\.
Custom cursor
If it could be developed further and possibly include other pointer styles it would be an excellent app - I think marketable and some thing that should be part of the OS.
Just put any 16-color Win95 cursor file in the same directory as FakeCursor.
A program called GoldIcon creates cursors that are compatible with FakeCursor.
I love this app; I have a top softkey dedicated to it so I can activate/deactivate it at any time.
There is no installation; the EXE can be located anywhere on the device and creates 2 .lnk files in windows/startmenu/programs when run:
FakeCursor.lnk (toggles it on and off) and FakeCursor Settings.lnk.
Doesn't seem to be anything FAKE about it. It's a real Dpad mouse!
Thanks for the tip - do you have any you could post, particularly a mouse-style pointer?
Thanks.
Can you use XP mouse pointers? I haven't seen a Win95 machine for years... or if you have one around, it would be great if you could simply post a few.
Hi, I have managed to finish my code for a simple PC screen viewer. It works with WP7.
See the video:
http://www.youtube.com/watch?v=cCwsuj7Hcno&list=HL1330329890&feature=mh_lolz
Now I will try to work out how to control the mouse.
About my app: I use a socket to connect to the server, no RDP protocol, and it works well. I have tried it on different machines and networks (IP addresses) and it works perfectly.
My app is a final-year project, so any ideas about how to control the cursor would be welcome.
This probably should have been in the same thread, but anyhow...
It depends on how your protocol is designed. Currently, all the data is pushed from the PC to the phone. To control the cursor, your phone needs to push data to the PC, which means that the PC server app needs to check for data from the phone app.
Sensing a touch on the phone is easy. Sending those coordinates over the network is pretty easy as well; you figured out sending video so I assume you can handle this part. On the PC, you'll want to receive those coordinates and then multiply them back out to their equivalent positions at the PC's resolution. Then you need to move the mouse cursor. There are Windows APIs for doing this, but I've never messed with them. They might be exposed to .NET, but I'm guessing that you'll need to make a call into native code (this is pretty easy, though; look up P/Invoke). Clicking can be implemented by having the phone detect that you tapped the same spot where the cursor already is, and having it send a different message to the server.
The messages will need to be determined by you, as the designer of the protocol. I do recommend using different messages for "move the cursor to here" and "click where the cursor is now." As for when to send the message, that also depends on how you implemented the app. If the entire transmission of the screen frame is one long socket send/receive, you'll have to exchange cursor commands between frames. If you do multiple smaller chunks, you can check for a cursor command and update the position or click as appropriate. Another alternative is to create two socket connections, and have the second one be used for cursor commands. I don't recommend this, though - it's not needed, it takes more code, and although I feel that everybody *should* learn multi-threaded netcode development, I'm not sure it's what you want to work on now.
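To make the message handling concrete, here's a rough server-side sketch. Your project is .NET, so treat this Java/AWT version purely as an illustration of the receive, scale, move/click loop rather than the P/Invoke route described above; the one-line wire format ("MOVE x y" with normalized 0..1 coordinates, or "CLICK") and the port number are invented for the example:

Code:
// Illustration only: receive cursor commands from the phone, scale the normalized
// coordinates to the PC's resolution, then move/click the real cursor.
import java.awt.Dimension;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.event.InputEvent;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class CursorServer {
    public static void main(String[] args) throws Exception {
        Dimension screen = Toolkit.getDefaultToolkit().getScreenSize();
        Robot robot = new Robot();
        try (ServerSocket server = new ServerSocket(5555);        // arbitrary example port
             Socket phone = server.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(phone.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] parts = line.trim().split(" ");
                if (parts[0].equals("MOVE") && parts.length == 3) {
                    // Phone sends normalized coordinates; multiply back out to the PC resolution.
                    int x = (int) (Double.parseDouble(parts[1]) * screen.width);
                    int y = (int) (Double.parseDouble(parts[2]) * screen.height);
                    robot.mouseMove(x, y);
                } else if (parts[0].equals("CLICK")) {
                    // Click where the cursor already is, per the separate-message advice above.
                    robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
                    robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
                }
            }
        }
    }
}

The same split between a "move" message and a "click" message carries over directly to a .NET server; only the Robot calls would be replaced by the native cursor APIs.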
I am really confused.
But first I will try to get the cursor coordinates from the PC and move the cursor on the WP7 screen; if I succeed, I will develop code for tap (click), double-tap, and drag.
I am really confused.
Can no one help?!
Google should be your best helper - try working with Google first.
On the WP7 app side you need to grab the tap position, translate it to desktop coordinates, and send the data to the server. On the server side you can use the Win32 API function SendInput(): http://msdn.microsoft.com/en-us/library/windows/desktop/ms646310(v=vs.85).aspx
SendInput can emulate mouse and keyboard events.
P.S. As for the YouTube video: try your project on a real device, using WiFi or 3G, and you will understand what the RDP protocol was built for.
Yes, I use Google for that; it's the best search engine.
I understand that I should use the coordinates to locate the position and send them from the WP7 to the PC.
But my question is how - is there any sample code that could help?
Guys, the app idea is great. Have you considered adding remote control of the touchscreen? It is possible through adb with the input tap, input swipe, and input text commands.
If you combined that with mouse tracking and coordinate capturing in the client, you'd have a full-blown device remote.
A proof of concept is done in VB; see http://forum.xda-developers.com/showthread.php?t=2786395
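The linked proof of concept is VB, but the core of the idea is just shelling out to adb, so a minimal sketch of the same thing in Java looks like this (assuming adb is on the PATH, a device is connected with USB debugging enabled, and the coordinates are placeholders):

Code:
// Drive the device's touchscreen from the PC by shelling out to adb's "input" command.
import java.io.IOException;

public class AdbInput {
    static void run(String... args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(args);
        pb.inheritIO();                    // show adb's output and errors on the console
        pb.start().waitFor();
    }

    public static void main(String[] args) throws Exception {
        run("adb", "shell", "input", "tap", "540", "960");                         // tap
        run("adb", "shell", "input", "swipe", "100", "800", "900", "800", "300");  // swipe
        run("adb", "shell", "input", "text", "hello");                             // type
    }
}

Capture mouse coordinates in the client, map them to the device's resolution, and feed them into the tap/swipe calls, and you have the remote control described above.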
ogonzuhnina said:
Guys, the app idea is great, have you considered adding remote control of the touchscreen? it is possible through adb by input tap, input swipe + input text commands.
If you would combine that with mouse tracking, coordinate capturing in the client, you'd have full blown device remote
proof of concept is done in VB:see http://forum.xda-developers.com/showthread.php?t=2786395
Thank you for your suggestion; we'll consider adding it in a future version. Actually, we've received some similar suggestions.
Personally, I think this feature is cool but not useful to me. I might use it when the screen is broken or the phone is not nearby. But since I can use AirDroid to send SMS and make calls, what would I use this for? I can do other things like tweeting or watching video on the PC directly.
Well, maybe I will find it useful when I break my touchscreen.
I've moved your post to this new thread.
Hint: we are developing a VNC feature in the new version. Hope you like it.
Hi all,
A few days ago I repaired an HTC Desire Z with a Russian keyboard and installed CM10.2 on it. And of course I want a working Russian keyboard layout.
Unfortunately, only the official firmware supports it, and the problem cannot be solved by editing layout tables, because some buttons carry two Cyrillic letters. There is the ruKeyboard application to fix this, but it is closed source, so it isn't acceptable to me.
So, I'm going to develop my own application for it (open source, of course) and want to ask some questions... I have a lot of development experience (especially low-level, e.g. drivers, MCU firmware and so on), but I have never programmed for Android (though I know Java to some degree).
Of course, I could patch the Android kernel/sources to get the task done, but I suppose that's a bad idea, because I would need to port the changes to new versions and so on. So, I want to process keyboard events from userspace.
My question: is it possible to hook all hardware keyboard events (i.e. scancodes, not characters) from userspace, remove them from the message queue, and produce new events? I know that producing new events is possible, but what about hooking them (like MS Windows event hooks)? Can you give me a hint (maybe a link to an example or a suitable API)?
Thank you in advance.
FossaFX said:
My question: Is it possible to hook all hardware keyboard events (i.e. scancodes, not characters) from userspace, remove them from message queue, and produce new events? I know that producing new events is possible, but what about hooking it (like MS Windows event hooks?). Can you give me a hint (maybe link to example or API, suitable for it)?
I am not a programmer, but I would be interested in an app that could substitute key-press events (on my Motorola Droid 4).
Have you heard of the Xposed framework? Maybe it would be a way to achieve your goal, and Xposed might make things much easier for you.
daniel_m said:
I am not a programmer, but I would be interested in an app that could substitute key press events (on my Motorola Droid 4).
Have you heard of the Xposed framework? Maybe this would be a possibility to achieve your goal and Xposed might make things much easier for you.
Thank you, I'll read about it (and no, I hadn't heard of it; I have never programmed for Android).
daniel_m said:
I am not a programmer, but I would be interested in an app that could substitute key press events (on my Motorola Droid 4).
Have you heard of the Xposed framework? Maybe this would be a possibility to achieve your goal and Xposed might make things much easier for you.
http://www.howtogeek.com/195476/7-t...ramework-on-a-rooted-android-phone-or-tablet/ looks like a good thing. So I'll investigate the Xposed sources to see how they did it.
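From a first look at existing modules, the hooking pattern seems to be roughly this (only a sketch - the target class and method here, PhoneWindowManager.interceptKeyBeforeQueueing, and its parameter list differ between Android versions, so the names below are an assumption to be checked against the CM10.2 sources):

Code:
// Rough sketch of the Xposed hooking pattern, not a working module.
import android.view.KeyEvent;

import de.robv.android.xposed.IXposedHookZygoteInit;
import de.robv.android.xposed.XC_MethodHook;
import de.robv.android.xposed.XposedBridge;
import de.robv.android.xposed.XposedHelpers;

public class KeyHookModule implements IXposedHookZygoteInit {
    @Override
    public void initZygote(StartupParam startupParam) throws Throwable {
        Class<?> pwm = XposedHelpers.findClass(
                "com.android.internal.policy.impl.PhoneWindowManager", null);

        XposedHelpers.findAndHookMethod(pwm, "interceptKeyBeforeQueueing",
                KeyEvent.class, int.class, boolean.class,   // JB-era signature, verify per ROM
                new XC_MethodHook() {
                    @Override
                    protected void beforeHookedMethod(MethodHookParam param) {
                        KeyEvent event = (KeyEvent) param.args[0];
                        XposedBridge.log("scan=" + event.getScanCode()
                                + " key=" + event.getKeyCode()
                                + " action=" + event.getAction());
                        // To swallow the original event, short-circuit the method here
                        // (param.setResult(0)) and inject a substitute event instead.
                    }
                });
    }
}

That only covers seeing and suppressing events before they are queued; producing the replacement events is a separate step, which, as I said, is already known to be possible.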
Good luck!
It would be wonderful for yet another useful Xposed module to see the light of day.
Greetings all,
I am working on an AccessibilityService that takes input from game controllers (PS5 controller, Xbox controller, etc.). I am using onKeyEvent() for handling button presses and releases; however, I am having a lot of trouble figuring out how to receive input from the joysticks.
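For context, the button side is just the standard onKeyEvent() override, stripped down to a sketch here (the real service also declares android:canRequestFilterKeyEvents and sets FLAG_REQUEST_FILTER_KEY_EVENTS):

Code:
// Simplified sketch: handling controller button presses in an AccessibilityService.
import android.accessibilityservice.AccessibilityService;
import android.view.KeyEvent;
import android.view.accessibility.AccessibilityEvent;

public class ControllerService extends AccessibilityService {

    @Override
    protected boolean onKeyEvent(KeyEvent event) {
        if (event.getAction() == KeyEvent.ACTION_DOWN) {
            switch (event.getKeyCode()) {
                case KeyEvent.KEYCODE_BUTTON_A:
                    // handle A press
                    return true;   // consume the event
                case KeyEvent.KEYCODE_BUTTON_B:
                    // handle B press
                    return true;
            }
        }
        return super.onKeyEvent(event);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}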
Generally I would use onGenericMotionEvent() to handle MotionEvents from the joysticks, but since this is an AccessibilityService and not an Activity, it doesn't seem to have that method.
The MotionEvent axes I want to handle are as follows: AXIS_X, AXIS_Y, AXIS_Z, AXIS_RZ, AXIS_RY, AXIS_RX, AXIS_HAT_X, AXIS_HAT_Y, AXIS_LTRIGGER, AXIS_RTRIGGER, AXIS_BRAKE, AXIS_GAS
Does anyone here know how I could handle MotionEvents from an AccessibilityService? I've looked at the official docs and codelabs a lot, but this has me stumped.
Regards,
0xB01b