Intelligent handling of touchscreen input? - General Questions and Answers

Hello! My Android 7 device may be called a smartphone, but it is incredibly dumb when it comes to interpreting the input it receives when I touch the screen! I am wondering whether there are any apps or developer projects that try to address this. No doubt it is partly a hardware issue; maybe some devices have better sensitivity or let you tweak more settings than just the mysterious 'pointer speed'. But beyond this, are there any attempts to augment whatever module translates physical screen data into logical clicks, drags and so on? For instance, through active or passive training to distinguish drags from clicks, or by examining the screen pixels near where I click, so that a click in the vicinity of a text label is treated as an attempt to press the label, rather than as a press on an empty area of the background for no apparent reason? Thanks in advance for any ideas! — Joseph
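For context, the "drag vs. click" decision already exists in rough form at the app level: Android applies a movement tolerance called touch slop before it treats a gesture as a drag. Below is a minimal sketch of a tap-vs-drag classifier built on that system value; the class name and fields are illustrative, not from any existing project.

import android.content.Context;
import android.view.MotionEvent;
import android.view.View;
import android.view.ViewConfiguration;

public class TapOrDragListener implements View.OnTouchListener {
    private final int touchSlop;  // system-tuned movement tolerance, in pixels
    private float downX, downY;
    private boolean dragging;

    public TapOrDragListener(Context context) {
        touchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                dragging = false;
                return true;
            case MotionEvent.ACTION_MOVE:
                // only call it a drag once the pointer moves past the slop radius
                if (Math.hypot(event.getX() - downX, event.getY() - downY) > touchSlop) {
                    dragging = true;
                }
                return true;
            case MotionEvent.ACTION_UP:
                if (!dragging) {
                    v.performClick();  // stayed within slop the whole time: a tap
                }
                return true;
        }
        return false;
    }
}

Anything that "trains" this decision per user would have to replace or wrap logic like the above, which is why this kind of work tends to live in device drivers or custom ROMs rather than ordinary apps.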

Related

Acceleration of the real movement

Hi, I'm trying to use the sensors built into an Android phone to detect the acceleration of the person carrying it. The problem is that the sensors are too sensitive: even a small motion, say tilting the phone, shows up in the readings. What I want is for the phone to report the acceleration of the person accurately, no matter where and how the phone is placed on the body (e.g. regardless of its orientation)...
So does anybody have any idea? Thank you very much!
PS. I managed to cancel out the G force on the x, y, z axes, and then theoretically the only acceleration left should be that of the movement. But it didn't work out that way, because there were still fluctuations in the acceleration on the x, y, z axes whenever I touched the phone...
I don't get it.
So, you want to measure acceleration, and the problem is that the phone measures the acceleration?
About the "too sensitive" part: is that really a problem? You can just take averages (per second, per minute, ...), max acceleration, ...
Also, cancel out the G force? I haven't used the Dream accelerometer, but I would guess it gives you separate values for static and dynamic forces, doesn't it?
Wouldn't you just need to do the math to figure out the zero point? I haven't tested any of this and this isn't rigorous math, but let's pretend you see the motion on X as +6 and on Y as -2. Then you would know it moved +4 for whatever duration... Does that make sense, or am I way off?
paxku said:
I don't get it.
So, you want to measure acceleration, and the problem is that the phone measures the acceleration?
About the "too sensitive" part: is that really a problem? You can just take averages (per second, per minute, ...), max acceleration, ...
Also, cancel out the G force? I haven't used the Dream accelerometer, but I would guess it gives you separate values for static and dynamic forces, doesn't it?
Hi paxku, thank you. Let me give you an example. Say I carry the phone and move along the x axis. If the phone had no forces on it other than the one moving it, then theoretically the accelerometer would read just the acceleration of that movement. However, that is never the case: when someone carries the phone, it is inevitably shaken and tilted from time to time (think of it sitting in a pocket), which applies many "interfering" forces. The reading then shows the sum of all these accelerations, rather than only the movement I want... So is there any way to get rid of the interference and leave only the acceleration of the movement?
By "too sensitive" I actually meant "too sensitive to interfering forces". I tried setting a threshold to filter out the interference, but much of the time the moving force is actually far smaller than the interfering forces, for example when I carry the phone while moving very, very slowly. The result is that the movement gets filtered out too...
As for the G force, I don't think my phone gives separate values. Does it really depend on the model of the phone? I would guess all Android phones handle this the same way. Check out this excerpt from the Android documentation:
Sensor.TYPE_ACCELEROMETER:
All values are in SI units (m/s^2) and measure the acceleration applied to the phone minus the force of gravity.
values[0]: Acceleration minus Gx on the x-axis
values[1]: Acceleration minus Gy on the y-axis
values[2]: Acceleration minus Gz on the z-axis
Thank you very much!!!
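For reference, with only the accelerometer available, the usual trick for separating gravity from linear acceleration is a simple low-pass filter: treat the slowly varying component of the readings as gravity and subtract it out. A minimal sketch, where ALPHA is an illustrative smoothing constant you would tune:

static final float ALPHA = 0.8f;   // closer to 1 = smoother gravity estimate
float[] gravity = new float[3];
float[] linear = new float[3];

void onAccelerometerReading(float[] values) {
    for (int i = 0; i < 3; i++) {
        // the slowly varying part of the signal is (approximately) gravity
        gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];
        // what remains is the (noisy) linear acceleration
        linear[i] = values[i] - gravity[i];
    }
}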
youneek said:
Wouldn't you just need to do the math to figure out the zero point? I haven't tested any of this and this isn't rigorous math, but let's pretend you see the motion on X as +6 and on Y as -2. Then you would know it moved +4 for whatever duration... Does that make sense, or am I way off?
Er... I guess the math we're talking about is different. Actually, I don't need to calculate the resultant force; it's fine to leave it decomposed on the x, y, z axes. The math I did was to decompose the G force onto x, y, z and then subtract Gx, Gy, Gz from the decomposed external force, to get rid of the G-force interference. For the reasoning behind this, see my reply above, where I quoted from the Android documentation how the acceleration readings are generated.
I would appreciate any ideas...
How to get acceleration and orientation readings at the same time?
Hi, it seems that on Android both the acceleration readings and the orientation readings come from values[0], values[1], values[2], depending on which sensor type is being monitored. So if I want the acceleration and orientation data at the same point in time, what do I do?
I'm not sure if I'm doing this correctly:
public void onSensorChanged(int sensor, float[] values) {
    synchronized (this) {
        switch (sensor) {
            case SensorManager.SENSOR_ORIENTATION:
                // values = [azimuth, pitch, roll], in degrees
                pitch = values[1];
                roll = values[2];
                break;
            case SensorManager.SENSOR_ACCELEROMETER:
                // raw accelerations in m/s^2, gravity still included
                r_ax = values[0];
                r_ay = values[1];
                r_az = values[2];
                break;
        }
    }
}
But I think this only gives me the values of whichever sensor changed at that moment. What if both sensors change at the same time?
Thank you!
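For what it's worth, the pre-API-3 SensorListener interface used above registers sensors as a bitmask, so a single listener can receive both streams; onSensorChanged still fires separately per sensor, and the switch above simply keeps the latest value of each in fields. A sketch of the registration, assuming the listener lives in an Activity:

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
sm.registerListener(this,
        SensorManager.SENSOR_ACCELEROMETER | SensorManager.SENSOR_ORIENTATION,
        SensorManager.SENSOR_DELAY_GAME);

This API gives no way to get both values stamped at the exact same instant; pairing the most recent reading of each is the usual compromise.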
How to get acceleration and orientation at the same time?
Hi... sorry to post the same thing again, but I really have to know this...
1. How do I get acceleration and orientation data at the same time? I'm using onSensorChanged(), but this method only returns either acceleration or orientation at a time... I'm thinking: can I use getOrientation() together with onSensorChanged() to get the two at the same time? If so, how?
2. I want to know whether orientation is actually derived from acceleration, instead of being obtained independently. If so, does anybody know how the orientation is computed from the acceleration?
Thank you very much!
Stop opening new threads for the same topic... last warning.
The major issue is that the Google phone doesn't detect rotation directly.
You can get X, Y, Z acceleration values, but part of that "value" comes from rotation and the other part from actual linear acceleration.
The first thing we need to figure out is: what exactly are you trying to measure? If you're trying to measure movement using accelerometers, you may be disappointed.
Remember that accelerometers measure changes in speed, not changes in position. Say I'm walking across a room at a constant speed and start taking measurements. I'll see accelerations up and down, because my gait keeps alternating between moving up and moving down; likewise for left-right and even front-back, since a gait is not perfectly smooth. Over time these accelerations tend to cancel out, yet I'm still moving.
The only way you're going to be able to calculate movement with accelerometers is to start from a known velocity and then calculate subsequent velocities and displacements from the accelerometer measurements (known in the navigation world as dead reckoning).
Long story short: movement in a straight line at constant velocity has no acceleration (or force) component and thus cannot be detected or measured by an accelerometer.
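To make the dead-reckoning point concrete, here is a minimal integration sketch, assuming you start at rest and have already removed gravity from the readings; in practice, sensor noise makes the position estimate drift badly within seconds:

float[] velocity = new float[3];
float[] position = new float[3];

// dt = time since the previous sample, in seconds
void integrate(float[] linearAccel, float dt) {
    for (int i = 0; i < 3; i++) {
        velocity[i] += linearAccel[i] * dt;  // v = v0 + a*dt
        position[i] += velocity[i] * dt;     // x = x0 + v*dt
    }
}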
fyp said:
Hi, I'm trying to use the sensors built into an Android phone to detect the acceleration of the person carrying it. The problem is that the sensors are too sensitive: even a small motion, say tilting the phone, shows up in the readings. What I want is for the phone to report the acceleration of the person accurately, no matter where and how the phone is placed on the body (e.g. regardless of its orientation)...
So does anybody have any idea? Thank you very much!
PS. I managed to cancel out the G force on the x, y, z axes, and then theoretically the only acceleration left should be that of the movement. But it didn't work out that way, because there were still fluctuations in the acceleration on the x, y, z axes whenever I touched the phone...
Doesn't this whole idea go against basic laws of physics? You can't possibly tell the difference between the force of gravity and the force of the person moving forward. The best you could do is wait for the person to put the phone wherever they're going to keep it, make absolutely sure somehow that it doesn't move at all, and use that as a zero point. The basic point is that it's physically impossible to distinguish gravitational acceleration from any other acceleration. I'm fairly sure that's the equivalence principle from Einstein's general relativity (though it's been a while).
[email protected] said:
Doesn't this whole idea go against basic laws of physics? You can't possibly tell the difference between the force of gravity and the force of the person moving forward. The best you could do is wait for the person to put the phone wherever they're going to keep it, make absolutely sure somehow that it doesn't move at all, and use that as a zero point. The basic point is that it's physically impossible to distinguish gravitational acceleration from any other acceleration. I'm fairly sure that's the equivalence principle from Einstein's general relativity (though it's been a while).
Thank you! Yes, you are absolutely right. That's why I tried to eliminate G first: Android defines (0, 0, g) as the acceleration on (x, y, z) when the phone lies flat on a surface, so when the phone is moved or rotated, g is automatically decomposed onto x, y, z, and I have to cancel those components out. Theoretically, G could be eliminated by working out its decomposed values on x, y, z and reversing the computation, so that the only acceleration left is that of the movement. But there is a new problem: to work out the decomposed G on x, y, z, I need the orientation of the phone, specifically pitch and roll. It seems that Android cannot report the acceleration values and the orientation values at the same instant, so the calculation of the decomposed G uses a wrong pair of pitch and roll. It also looks like the orientation readings are themselves derived from the acceleration... which may further throw off the calculation of the decomposed G.
Here are the formulas I'm using to decompose G, and then to eliminate it on x, y, z:
gx = (float) (GRAVITY_EARTH * Math.sin(roll*DEGREE_CONV));
gy = (float) (GRAVITY_EARTH * Math.sin(pitch*DEGREE_CONV));
gz = (float) (GRAVITY_EARTH * (Math.cos(pitch*DEGREE_CONV)*Math.cos(roll*DEGREE_CONV)));
ax = r_ax - gx;
ay = r_ay - gy;
az = r_az + gz;
where DEGREE_CONV converts degrees to radians, r_ax, r_ay, r_az are the acceleration values straight from the SensorListener, and ax, ay, az are the net accelerations, which should be 0 when the phone is stationary.
Am I doing this correctly? Besides the problem that roll and pitch are not obtained at the same time as r_ax, r_ay, r_az, I suspect the phone itself may not decompose G using exactly these formulas, so I would have to find out how the phone does the decomposition in order to reverse it.
PS. Honestly, Android keeping G in the readings by default is really annoying...
sjbayer3 said:
The major issue is that the Google phone doesn't detect rotation directly.
You can get X, Y, Z acceleration values, but part of that "value" comes from rotation and the other part from actual linear acceleration.
Thank you. Are you saying that the orientation/rotation values are not detected but computed from the acceleration values? That's just what I'm thinking too. It would explain why the SensorListener always reports new acceleration values first and new orientations only after a short delay, and also why the orientation sometimes changes even when it hasn't really changed, just because there was a big change in acceleration (for example, if you suddenly move the phone very fast horizontally, the screen orientation may change).
So is there any way to get the acceleration and its corresponding orientation (pitch and roll) at the same point in time?
Sicarius128 said:
The first thing we need to figure out is: what exactly are you trying to measure? If you're trying to measure movement using accelerometers, you may be disappointed.
Remember that accelerometers measure changes in speed, not changes in position. Say I'm walking across a room at a constant speed and start taking measurements. I'll see accelerations up and down, because my gait keeps alternating between moving up and moving down; likewise for left-right and even front-back, since a gait is not perfectly smooth. Over time these accelerations tend to cancel out, yet I'm still moving.
The only way you're going to be able to calculate movement with accelerometers is to start from a known velocity and then calculate subsequent velocities and displacements from the accelerometer measurements (known in the navigation world as dead reckoning).
Long story short: movement in a straight line at constant velocity has no acceleration (or force) component and thus cannot be detected or measured by an accelerometer.
Thank you for your answer. I'm aware of the cumulative error involved in dead reckoning, but that error is acceptable to me, as long as the displacement integrated twice from the acceleration shows the general direction of the phone's movement. The phone will always start the calculation from a stationary state, meaning the initial velocity is zero. As for movement at constant speed: theoretically the displacement can still be computed, because the travel time and the velocity (calculated from the previous acceleration phase) are known. Although no acceleration is detected in that case, the displacement is simply time * velocity.
fyp said:
Thank you! Yes, you are absolutely right. That's why I tried to eliminate G first: Android defines (0, 0, g) as the acceleration on (x, y, z) when the phone lies flat on a surface, so when the phone is moved or rotated, g is automatically decomposed onto x, y, z, and I have to cancel those components out. Theoretically, G could be eliminated by working out its decomposed values on x, y, z and reversing the computation, so that the only acceleration left is that of the movement. But there is a new problem: to work out the decomposed G on x, y, z, I need the orientation of the phone, specifically pitch and roll. It seems that Android cannot report the acceleration values and the orientation values at the same instant, so the calculation of the decomposed G uses a wrong pair of pitch and roll. It also looks like the orientation readings are themselves derived from the acceleration... which may further throw off the calculation of the decomposed G.
Here are the formulas I'm using to decompose G, and then to eliminate it on x, y, z:
gx = (float) (GRAVITY_EARTH * Math.sin(roll*DEGREE_CONV));
gy = (float) (GRAVITY_EARTH * Math.sin(pitch*DEGREE_CONV));
gz = (float) (GRAVITY_EARTH * (Math.cos(pitch*DEGREE_CONV)*Math.cos(roll*DEGREE_CONV)));
ax = r_ax - gx;
ay = r_ay - gy;
az = r_az + gz;
where DEGREE_CONV converts degrees to radians, r_ax, r_ay, r_az are the acceleration values straight from the SensorListener, and ax, ay, az are the net accelerations, which should be 0 when the phone is stationary.
Am I doing this correctly? Besides the problem that roll and pitch are not obtained at the same time as r_ax, r_ay, r_az, I suspect the phone itself may not decompose G using exactly these formulas, so I would have to find out how the phone does the decomposition in order to reverse it.
PS. Honestly, Android keeping G in the readings by default is really annoying...
Hi,
Your gx and gy are almost right, but gz is wrong.
Actually, gz is quite easy to calculate once we have gx and gy. Just keep the following equation in mind:
gx^2 + gy^2 + gz^2 = G^2
Thus,
abs(gz) = sqrt(G^2 - gx^2 - gy^2)
Be careful about the signs of gx, gy, and gz. gz's sign can be determined from the pitch value: if abs(pitch) > 90, gz should be negative; otherwise it is positive. BTW, I think gy's sign is wrong in your equation.
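Putting that correction together as code, a sketch reusing the variable names from the quoted post and the sign conventions suggested above:

gx = (float) (GRAVITY_EARTH * Math.sin(roll * DEGREE_CONV));
gy = (float) (-GRAVITY_EARTH * Math.sin(pitch * DEGREE_CONV));  // sign flipped per the note above
// gz from the constraint gx^2 + gy^2 + gz^2 = G^2
gz = (float) Math.sqrt(GRAVITY_EARTH * GRAVITY_EARTH - gx * gx - gy * gy);
if (Math.abs(pitch) > 90) {
    gz = -gz;  // tipped past vertical: gravity points out of the screen face
}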

[Q] AMOLED On-Screen Notification?

It is my understanding that AMOLED screens consume no power for pixels that are set to black. It is also my understanding that phones are now being released without a notification LED (such as the Nexus S). Finally, Android phones expect to spend some time in the idle state, doing various update tasks and reducing overall power consumption, so never allowing them to idle would not be an acceptable solution.
Assuming the above is correct, how difficult would it be to change Android so that notification information could be displayed on-screen in a few pixels (say, in the center of the screen)? The majority of the screen would then consume no power (black pixels), while a few colored pixels signal notifications. These pixels could be set to black when there are no notifications, so the small amount of power they do consume is used only when a notification needs attention.
A step beyond this would be an interface that lets other developers create themed idle screens customizable by the user. This might be as simple as changing the colors, the rate of flashing (if any), or the number of notification pixels used. Different apps could also notify in different parts of the screen, giving feedback about what needs attention at a quick glance, without touching the phone.
My background is as a C++ software developer who has dabbled in Android, writing various small apps. I have never worked with the OS itself, so I am uncertain whether this would be a trivial task or a very complex one. Perhaps it could even be done without any OS changes (though my understanding is that it can't).
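As a rough app-level illustration of the idea (not an OS change): a fullscreen view that stays black, which an AMOLED panel leaves unlit, except for a small notification dot. The class and method names here are made up for the sketch:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

public class NotificationDotView extends View {
    private final Paint dot = new Paint();
    private int dotColor = Color.BLACK;  // black = "no notification": pixels stay dark

    public NotificationDotView(Context context) {
        super(context);
        setBackgroundColor(Color.BLACK);
    }

    public void setDotColor(int color) {
        dotColor = color;
        invalidate();  // redraw with the new color
    }

    @Override
    protected void onDraw(Canvas canvas) {
        dot.setColor(dotColor);
        // a handful of lit pixels in the center; everything else stays black
        canvas.drawCircle(getWidth() / 2f, getHeight() / 2f, 4f, dot);
    }
}

Keeping the device awake just to show such a view still costs CPU power, which is why a proper implementation would need help from the OS, as the poster suspected.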
Check this thread/app: http://forum.xda-developers.com/showthread.php?t=730692
Sent from my GT-I9000 using XDA App

[Q] Query about smartphone's accelerometer

Hi,
Is it possible to detect or measure a car's acceleration and deceleration through a smartphone's accelerometer?
If yes, is it possible on all new smartphones?
And how would one implement it on Android?
Thanks
Depending on what you want to do, you'd have to figure out how best to filter out the ordinary movement of a person just picking the phone up and putting it down; I don't know what that data looks like or how you could distinguish it. It's also unclear whether the phone is sensitive enough to pick up anything meaningful short of the kind of hard stop that would send it flying to the floor (they do seem pretty sensitive, but you'd have to test it).
You can determine speed using GPS, and it's pretty accurate (though it uses a lot of battery); it would naturally filter out small movements of a person handling the phone inside the vehicle. As for implementation, I have no idea.
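A sketch of the GPS route: Location.getSpeed() reports speed in m/s, and differencing successive fixes gives a rough acceleration estimate. The field names are illustrative, and this fragment assumes it sits inside a LocationListener:

private float lastSpeed;
private long lastTimeMs;

public void onLocationChanged(Location loc) {
    if (lastTimeMs != 0 && loc.hasSpeed()) {
        float dt = (loc.getTime() - lastTimeMs) / 1000f;  // seconds between fixes
        float accel = (loc.getSpeed() - lastSpeed) / dt;  // m/s^2; negative = braking
        // ... log or report accel here ...
    }
    lastSpeed = loc.getSpeed();
    lastTimeMs = loc.getTime();
}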

[Q] Is the threshold for registering a touch event adjustable?

I'm using an Adonit Jot-style stylus on an Android phone (a ZTE V5 Max on Android 4.4.4, but I experience similar problems on other devices), which generates a weaker signal on the capacitive touchscreen than a finger does. The phone's software appears to be tuned for finger touch only, and drawing with the stylus often results in broken lines.
On the pointer-location debug display, the minimum pressure at which a touch point registers is 0.09; with this stylus I get 0.10–0.11 most of the time, so the headroom is really low. With a finger the pressure level is usually above 0.15. This makes me wonder whether I can lower the threshold at which a signal is registered as a touch, to make inking with the stylus more stable.
I'm aware of some build.prop settings suggested here and there. However, after playing with several values and reading some of the documentation, I believe those only change the touch behavior after a touch point has already been registered. The actual threshold for a signal to be recognized as a touch is not changed, and my problem remains.
I've also come across this thread, which appears to be very relevant; unfortunately, the file-system paths it mentions don't exist on my 4.4 ROM.
Is there anything I can tweak to change this threshold? (I have root access.) Or is it hard-coded in device-specific drivers?
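One thing that is easy to check from an app, although it does not change the firmware threshold: MotionEvent exposes a normalized pressure per event, so you can log and compare stylus vs. finger values yourself, e.g. in an Activity:

@Override
public boolean onTouchEvent(MotionEvent event) {
    // normalized pressure, roughly 0..1 depending on the device
    Log.d("Pressure", "p=" + event.getPressure());
    return super.onTouchEvent(event);
}

Events below the firmware's own cutoff never reach this callback, which matches the observation that build.prop tweaks only affect behavior after a touch is registered.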

Improve one handed usage on smartphones.

So, with all the sensors and biometric features in our smartphones, many of them more gimmicky than useful, paired with our constant appetite for big screens, how about improving the one-handed usability of our future devices?
For starters, app menus aren't properly designed for one-handed use. Many require you to swipe from the left edge of the screen to the right to bring out a menu located on the left of the screen, ordered from top to bottom.
Now try reaching the top-left corner menu of your handy smartphone without holding the phone in an uncomfortable, precarious way (at risk of dropping it) just to reach the option you want, be it the inbox, recent messages, or composing an email; all of those options usually sit in the top-left corner.
I know there are some solutions, some rather crude, like the iPhone Plus approach that shrinks the big screen into a small one, or the one-handed keyboard mode found on most Android phones.
But in my experience these solutions are not optimal. For example, on my Xperia I have to go through about three taps to switch the keyboard to its smaller one-handed variant; not great for quick access.
Why don't menu lists start from the bottom, so they are within reach of the available hand and thumb? Why don't they design adaptive apps?
And that's where sensors come in: how come our smartphones don't have sensors that detect which hand we are holding the phone with?
Left- or right-handed, the interface of our apps could place menus and tools within reach of the thumb, adapting quickly and automatically to the hand holding the phone, based on sensor information, be it the keyboard, the email app, the camera, or whatever your mind can imagine.
Because let's be real: sometimes we quickly pull out the phone with whichever hand is free on the go, and I think seamless access is needed in our fast everyday lives.
What do you think of this idea? Is it remotely doable? I'm no developer, but I think everyday usage creates needs that are easily covered with current tech. I want to hear your thoughts.
I have worked with Android sensors, but I can't imagine what kind of sensor a phone would need in order to detect whether you are holding it with your left or right hand. Somehow, though, I imagine such a sensor would not be difficult to make.
Someday I want to be a designer solving these kinds of problems and improving user experiences, even by small amounts.
I think a touch sensor on the bezels would be enough.
DrKrFfXx said:
I think a touch sensor on the bezels would be enough.
I also envisioned something like this at first. Now, however, I'm thinking that maybe the proximity sensor could be used to recognize the thumb.
Shouldn't be too hard to detect which hand you're using based on where your touches land when you scroll.
Sent from my LG-D855 using Tapatalk
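A toy heuristic in the spirit of that suggestion: thumb touches from the right hand tend to land toward the right half of the screen, so a simple majority vote over recent touch-down positions could guess the hand. Purely illustrative, not a tested detector:

private int rightVotes, totalVotes;

public void recordTouch(MotionEvent event, int screenWidth) {
    if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
        if (event.getX() > screenWidth / 2f) rightVotes++;
        totalVotes++;
    }
}

public boolean probablyRightHanded() {
    // require a handful of samples before committing to a guess
    return totalVotes > 10 && rightVotes * 2 > totalVotes;
}

A real detector would probably also look at the arc of scroll gestures, since a thumb pivots around the base of the hand.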
