Hi there, this is my first post on xda so forgive me if this is put in the wrong forum.
I am new to android and wish to play around with the emulator.
What I want to do is to create my own piece of virtual hardware that can collect OpenGL commands and produce OpenGL graphics.
I have been told that in order to do this I will need to write a Linux kernel driver to enable communication with the hardware. Additionally, I will need to write an Android user space library to call the kernel driver.
To start with, I plan on making a very simple piece of hardware that only handles, say, one or two commands.
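From what I've gathered so far, the kernel side would be a character device exposing those one or two commands as ioctls, which the user space library would then call. Something roughly like the sketch below (all the names here, like mygl and the ioctl numbers, are placeholders I made up, not anything from the Android source):

Code:
/* mygl.c - minimal sketch of a character device driver for the
 * emulated GL hardware; only the driver skeleton, no real work. */
#include <linux/module.h>
#include <linux/fs.h>
#include <linux/miscdevice.h>
#include <linux/uaccess.h>

#define MYGL_IOC_MAGIC 'g'
#define MYGL_IOC_CLEAR _IO(MYGL_IOC_MAGIC, 0)        /* command 1 */
#define MYGL_IOC_DRAW  _IOW(MYGL_IOC_MAGIC, 1, int)  /* command 2 */

static long mygl_ioctl(struct file *f, unsigned int cmd, unsigned long arg)
{
	int value;

	switch (cmd) {
	case MYGL_IOC_CLEAR:
		/* forward a "clear" command to the virtual hardware */
		return 0;
	case MYGL_IOC_DRAW:
		if (copy_from_user(&value, (int __user *)arg, sizeof(value)))
			return -EFAULT;
		/* forward a "draw" command plus its argument */
		return 0;
	default:
		return -ENOTTY;
	}
}

static const struct file_operations mygl_fops = {
	.owner          = THIS_MODULE,
	.unlocked_ioctl = mygl_ioctl,
};

static struct miscdevice mygl_dev = {
	.minor = MISC_DYNAMIC_MINOR,
	.name  = "mygl",                /* shows up as /dev/mygl */
	.fops  = &mygl_fops,
};

static int __init mygl_init(void)  { return misc_register(&mygl_dev); }
static void __exit mygl_exit(void) { misc_deregister(&mygl_dev); }

module_init(mygl_init);
module_exit(mygl_exit);
MODULE_LICENSE("GPL");

The user space library would then just open("/dev/mygl", O_RDWR) and issue ioctl(fd, MYGL_IOC_CLEAR) and so on.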
Has anyone here done something like this? If so, do you have any tips or possible links to extra information?
Any feedback would be appreciated.
Regards
Has anyone done this?
Bump Bump
I'm surprised this has been left unanswered.
Yes, you can do this. There are several ways to do so, but I will explain two good options for you.
One: use the Android SDK emulator. It's configured for Android and simple to set up.
Two: use VirtualBox. If you have an Android ISO, you can install it as a bootable image in this software. This gives you more functionality than the SDK emulator, but it is not as simple: there are settings you will need to adjust to get it running. For a working Android ISO with limited functionality, you can download one here: http://www.android-x86.org/download
This should be in the development forum, but I cannot start a thread there. Moderators, please move it if possible.
I've made an OpenGL live wallpaper (really a port of a Windows program I developed long ago). It works fine (though extremely slowly, I guess due to the lack of real hardware acceleration) in the emulator and on my own device (Garmin-Asus A10), but I have received a lot of feedback about problems on other devices. The reports all say that textures are not drawn: primitives are rendered without texturing. Everything works except the textures, and they are the most important part.
I have no way to debug on other devices myself, so I am calling on other developers for help.
The code is fully open here: code.google.com/p/android-deep-wallpaper/
Application link in market: market.android.com/details?id=com.digitalinfinity.deep
I would like everyone to test it on their devices and possibly try to debug the code. The code is very simple, so I really have no idea where the problem could be.
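In case it helps whoever looks at the code: on GLES 1.x, two classic device-specific causes of "primitives drawn without texturing" are non-power-of-two texture sizes (core GLES 1.x does not require NPOT support) and a mipmap minification filter used without mipmaps actually generated, which leaves the texture incomplete. A defensive setup looks roughly like this (a generic C sketch, not an excerpt from the project; the Java GL10 calls map one-to-one):

Code:
#include <GLES/gl.h>

/* Texture setup that avoids the two usual device-compatibility traps. */
void setup_texture(GLuint tex, GLsizei w, GLsizei h, const void *pixels)
{
    glEnable(GL_TEXTURE_2D);              /* easy to forget per context */
    glBindTexture(GL_TEXTURE_2D, tex);

    /* GL_LINEAR needs no mipmaps; a GL_*_MIPMAP_* filter without
     * generated mipmaps makes the texture incomplete, and incomplete
     * textures are sampled as if texturing were disabled. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    /* w and h should be powers of two (128, 256, ...) to be safe on
     * GLES 1.x devices. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}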
PS: If someone would like to take over further maintenance of this project, just write to me.
Hi all,
I'm working on a personal project and am wondering what methods I could use to communicate with an IR sensor over UART. The sensor has its own chip and requires some configuration; I have it working perfectly fine in an Ubuntu environment through a user space application.
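For context, the user space side on Linux boils down to the standard termios pattern, roughly like this (the device path and baud rate below are placeholders, not my actual values):

Code:
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

/* Open and configure the sensor's serial port in raw 8N1 mode. */
int open_sensor(const char *path)           /* e.g. "/dev/ttyS1" */
{
    int fd = open(path, O_RDWR | O_NOCTTY);
    if (fd < 0) {
        perror("open");
        return -1;
    }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                        /* raw bytes, no echo, 8N1 */
    cfsetispeed(&tio, B115200);             /* placeholder baud rate */
    cfsetospeed(&tio, B115200);
    tio.c_cc[VMIN]  = 1;                    /* block until 1 byte arrives */
    tio.c_cc[VTIME] = 0;
    tcsetattr(fd, TCSANOW, &tio);
    return fd;                              /* then read()/write() as usual */
}

If the sensor ends up exposed as a tty node on a rooted Android device, the same code should work from an NDK binary, assuming the node's permissions allow it.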
Let us assume I successfully modify the hardware to plug my sensor into my Android device. I'm not sure what's next. I know getting the information from the sensor and into an app will be much easier than getting it to control the entire Android UI, so I will probably look into that first.
I looked into inputattach, but I'm not certain it can work with my sensor, and I couldn't find much documentation on it. I thought maybe I could do something with the HAL, but once again I found very little documentation on the Android HAL and couldn't learn much.
Does anyone have a good place to start or just any suggestions? I'm pretty lost here.
Thanks!
Bumping for answers. Thanks!
Right now, here's what I have:
-Padtie, an open-source "gamepad -> keyboard" mapper that runs on .NET 4 and hopefully will run on Windows RT. I am going to test it after work today. Since it is open source, we may be able to port it if it doesn't "just run".
-DirectInput (dinput) is not in Windows RT, so I'll be looking into adding it. We won't be able to add direct gamepad support in Windows Store apps (like the emulators), because Store apps can only use XInput, i.e. Xbox controllers (see the sketch below). This is where Padtie will hopefully work its magic, by making the emulator/game think a keyboard is being used.
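To make the XInput limitation concrete, the sketch below is essentially the entire controller surface available: up to four Xbox-style pads, and anything else never enumerates (a minimal desktop-style C sketch, just to show the API shape):

Code:
#include <windows.h>
#include <xinput.h>     /* link with xinput.lib */
#include <stdio.h>

int main(void)
{
    /* XInput only ever exposes XUSER_MAX_COUNT (4) Xbox-style pads;
     * a generic HID gamepad simply never shows up here. */
    for (DWORD i = 0; i < XUSER_MAX_COUNT; i++) {
        XINPUT_STATE state;
        ZeroMemory(&state, sizeof(state));
        if (XInputGetState(i, &state) == ERROR_SUCCESS)
            printf("pad %lu: buttons=0x%04x LX=%d LY=%d\n",
                   i, state.Gamepad.wButtons,
                   state.Gamepad.sThumbLX, state.Gamepad.sThumbLY);
        else
            printf("pad %lu: not connected\n", i);
    }
    return 0;
}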
To the more savvy devs: if Padtie works on RT but needs dinput, could I drop the dinput DLLs into the same directory as the exe? What are your thoughts on doing so?
If people are interested in this, please reply so I know how much pressure is on me to finish it. Well, besides the pressure I put on myself, because I really want it.
Not exactly related, but still: I connected a gamepad (Speedlink Xeox) to my XPS 10 and it was recognized (it's visible in Devices and Printers), but when I right-click it and select the option to open the gamepad control panel, nothing happens, and it shows the typical unknown-file icon (it seems the .cpl is missing entirely). Is this just a problem with my OS, or the result of MS removing part of the gamepad support?
About DirectInput: is there no DI at all, or is it there but Metro apps just can't use it? If it exists somewhere, this should help:
https://code.google.com/p/x360ce/
This would be amazing, as the PS4 controller pairs with the Surface over Bluetooth and is recognized as a wireless controller; however, no game seems to support it.
Very excited I found this thread, I was just about to create one on the very same topic!
Erisii,
I understand the desire to map to keyboard inputs, and it may even work for most things. However, I don't see how it would handle analog inputs, i.e. joysticks.
I don't know enough about the driver side of things, but is it possible for a program to convert the generic input of a controller to the XInput you speak of?
Also, I have some coding experience, but am far from a pro. Let me know if I can help in any way, I have a surface RT running 8.1.
Try RawInput.
Most HID joysticks support RawInput.
Sample: http://www.codeproject.com/Articles/185522/Using-the-Raw-Input-API-to-Process-Joystick-Input
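The registration step is only a few lines; parsing the WM_INPUT messages is what the linked article covers. A minimal C sketch:

Code:
#include <windows.h>

/* Ask Windows to deliver WM_INPUT for HID joysticks and gamepads
 * to an existing window; parse them later with GetRawInputData(). */
BOOL register_for_joysticks(HWND hwnd)
{
    RAWINPUTDEVICE rid[2];

    rid[0].usUsagePage = 0x01;              /* HID generic desktop page */
    rid[0].usUsage     = 0x04;              /* joystick */
    rid[0].dwFlags     = RIDEV_INPUTSINK;   /* receive even when unfocused */
    rid[0].hwndTarget  = hwnd;

    rid[1] = rid[0];
    rid[1].usUsage     = 0x05;              /* gamepad */

    return RegisterRawInputDevices(rid, 2, sizeof(rid[0]));
}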
windowsrtc said:
Try RawInput.
Most HID joysticks support RawInput.
Sample: http://www.codeproject.com/Articles/185522/Using-the-Raw-Input-API-to-Process-Joystick-Input
I apologize for being a newb. I will look into this a bit, but I assume we would need to write some application that utilizes that? Or are you saying we should port it?
Hello, I was recently made aware that my USB 360 panoramic camera module (Lolly360) does not work on Android 10 due to a bug in the Android 10 source that denies the proper permissions to devices using the USB video device class (UVC) APIs. I'm not 100% sure how things work, but I think an API targeted by SDK 28 (USBCamera-target-API-28.apk) is affected, and the bug may have been fixed in the latest Android 10 source. Unfortunately, I'm not sure whether that fix exists in OxygenOS 10.3.3.
Is anyone aware of this? And/or does anyone know where I could begin to tackle this issue?
I posted in the OnePlus community forums, but I feel like it may have fallen on deaf ears.
Sources for my information:
https://github.com/saki4510t/UVCPermissionTest/issues/1
https://issuetracker.google.com/issues/139087809
https://developer.android.com/about/versions/10/privacy/changes
https://www.camerafi.com/notice-android-10-devices-do-not-support-usb-camera-connection/