[Q] Query about smartphone's accelerometer - General Questions and Answers

Hi,
Is it possible to detect or measure a car's acceleration or deceleration through a smartphone's accelerometer?
If so, is it possible on all new smartphones?
And how would one implement it on Android?
Thanks

Depending on what you want to do, you'd have to figure out how best to filter out the regular movement of a person just picking the phone up and putting it down, say - I don't know what the data on that looks like or how you could distinguish the two. You'd also have to check whether the phone is sensitive enough to pick up anything meaningful beyond the sort of hard stop that would send it flying to the floor (they do seem pretty sensitive, but you'd have to test it).
You can determine speed using the GPS, and it's pretty accurate (though it will use a lot of battery); it would also naturally filter out small movements of a person handling the phone inside the vehicle. As for implementation, I have no idea.
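For the GPS route, a minimal sketch of reading speed with the classic LocationManager API, assuming it runs inside an Activity that already holds the ACCESS_FINE_LOCATION permission:
LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, new LocationListener() {
    public void onLocationChanged(Location loc) {
        if (loc.hasSpeed()) {
            // Ground speed in m/s straight from the GPS fix; multiply by 3.6 for km/h.
            float speedMps = loc.getSpeed();
        }
    }
    public void onStatusChanged(String provider, int status, Bundle extras) {}
    public void onProviderEnabled(String provider) {}
    public void onProviderDisabled(String provider) {}
});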

Related

Acceleration of the real movement

Hi, I'm trying to make use of the sensors built into an Android phone to detect the acceleration of the person carrying the phone. The problem is that the sensors are very sensitive to even a small motion, say tilting the phone. What I want is that no matter where and how the phone is placed on the body (e.g. whatever the phone's orientation), the phone is always able to show me the accurate acceleration of the person...
So does anybody have any idea??? Thank you very much!
PS. I managed to cancel out the G force on the x, y and z axes, and theoretically the only acceleration left should be that of the movement. But it didn't work that way, because there were still acceleration fluctuations on the x, y and z axes whenever I touched the phone...
I don't get it.
So, you want to measure acceleration, and the problem is that the phone measures the acceleration?
About the "too sensitive" part: is that really a problem? You can just take averages (for every second, every minute, ...), the maximum acceleration, ...
Also, cancel out the G force? I haven't used the Dream's accelerometer, but I'd guess it gives you separate values for static and dynamic forces, doesn't it?
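On the averaging suggestion: a simple exponential moving average is one common way to tame jittery accelerometer readings. A minimal sketch (the 0.8 weight is an arbitrary assumption to tune on-device):
// Exponential moving average per axis: ALPHA near 1 smooths heavily,
// ALPHA near 0 passes the raw signal through almost unchanged.
static final float ALPHA = 0.8f; // arbitrary; tune on-device
float[] smoothed = new float[3];

void smooth(float[] values) {
    for (int i = 0; i < 3; i++) {
        smoothed[i] = ALPHA * smoothed[i] + (1 - ALPHA) * values[i];
    }
}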
Wouldn't you just need to do the math to figure out the zero point? I haven't tested any of this and this isn't good math, but let's pretend you see the motion on X as +6 and on Y as -2. Then you would know it moved +4 for whatever duration... Does that make sense, or am I way off?
paxku said:
I don't get it.
So, you want to measure acceleration, and the problem is that the phone measures the acceleration?
About the "too sensitive" part: is that really a problem? You can just take averages (for every second, every minute, ...), the maximum acceleration, ...
Also, cancel out the G force? I haven't used the Dream's accelerometer, but I'd guess it gives you separate values for static and dynamic forces, doesn't it?
Hi paxku, thank you. Perhaps let me give you an example. Say I carry the phone and move in the x-axis direction. If the phone had no forces on it other than the one moving it, theoretically the accelerometer would show the reading of the moving acceleration alone. However, this is never the case, because when one carries the phone, the phone is inevitably shaken and tilted from time to time (think of it sitting in a pocket), which subjects it to many "interfering" forces. The reading therefore shows the sum of all these accelerations, rather than only the movement I want... So is there any way to get rid of this interference and leave only the acceleration of the movement?
By "too sensitive" I actually meant "too sensitive to interfering forces". I tried to set a threshold to filter out the interference, but the problem is that a lot of the time the moving force is actually much smaller than the interfering forces, for example when I carry the phone while moving very, very slowly. The movement then gets filtered out along with the noise...
As for the G force, I don't think my phone can give separate values. Does it really depend on the model of the phone? I'd guess all Android phones deal with this the same way. Check out this excerpt from the Android documentation:
Sensor.TYPE_ACCELEROMETER:
All values are in SI units (m/s^2) and measure the acceleration applied to the phone minus the force of gravity.
values[0]: Acceleration minus Gx on the x-axis
values[1]: Acceleration minus Gy on the y-axis
values[2]: Acceleration minus Gz on the z-axis
Thank you very much!!!
youneek said:
Wouldn't you just need to do the math to figure out the zero point? I haven't tested any of this and this isn't good math, but let's pretend you see the motion on X as +6 and on Y as -2. Then you would know it moved +4 for whatever duration... Does that make sense, or am I way off?
Er... I guess the math we're talking about is different. Actually, I don't need to calculate the resultant force; it's fine to leave it decomposed on the x, y and z axes. The math I did was to decompose the G force onto the x, y and z axes, and then subtract Gx, Gy and Gz from the decomposed external force, in order to get rid of the G-force interference. For the reason behind this, see my reply above where I quoted from the Android documentation how the acceleration readings are generated.
I would appreciate any ideas...
How to get acceleration and orientation readings at the same time?
Hi, it seems that on Android the acceleration readings and the orientation readings both arrive in values[0], values[1], values[2], depending on which sensor type is being monitored. So if I want the acceleration and orientation data from the same point in time, what do I do?
I'm not sure if I'm doing this correctly,
// Old (pre-1.5) SensorListener callback: each call carries data for ONE
// sensor, identified by the int parameter; the latest values are cached
// in fields for later use.
public void onSensorChanged(int sensor, float[] values) {
    synchronized (this) {
        switch (sensor) {
            case SensorManager.SENSOR_ORIENTATION:
                pitch = values[1];
                roll = values[2];
                break;
            case SensorManager.SENSOR_ACCELEROMETER:
                r_ax = values[0];
                r_ay = values[1];
                r_az = values[2];
                break;
        }
    }
}
But I think this only gives me the values of the one sensor that changed at that moment. What if both sensors change at the same time???
Thank you!!!
How to get acceleration and orientation at the same time?
Hi... sorry to post the same thing again, but I really have to know this...
1. How do I get acceleration and orientation data at the same time? I'm using onSensorChanged(), but this method only returns either acceleration or orientation at a time... I'm thinking, can I use getOrientation() together with onSensorChanged() to get the two at the same time? If so, how?
2. I want to know whether the orientation is actually derived from the acceleration, instead of being obtained independently. If so, does anybody know how the orientation is computed from the acceleration?
Thank you very much!!!
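For what it's worth, the later SensorManager API makes the derivation explicit: you can cache the latest accelerometer and magnetometer readings yourself and compute the orientation whenever a new accelerometer sample arrives, so both refer to the same moment. A minimal sketch using the modern SensorEventListener (field names are just illustrative):
private final float[] accel = new float[3];
private final float[] magnet = new float[3];
private final float[] rotation = new float[9];
private final float[] orientation = new float[3]; // azimuth, pitch, roll in radians

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        System.arraycopy(event.values, 0, magnet, 0, 3);
    } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, accel, 0, 3);
        // Orientation is computed from the two cached readings, so the
        // angles correspond to the accelerometer sample just received.
        if (SensorManager.getRotationMatrix(rotation, null, accel, magnet)) {
            SensorManager.getOrientation(rotation, orientation);
        }
    }
}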
Stop opening new threads for the same topic....last warning
The major issue is that the Google phone doesn't detect rotation.
Yes, you can get XYZ acceleration values, but part of that "value" is from rotation and the other part is from actual linear acceleration.
The first thing we need to figure out is what exactly you are trying to measure. If you're trying to measure movement using accelerometers, you may be disappointed.
Remember that accelerometers measure changes in velocity, not changes in position. So, let's say I'm walking across a room at a constant speed and start taking measurements. I'll see accelerations up and down, because I keep alternating between moving up and moving down; likewise for the left-right measurements, and even front to back, as our gait is not perfectly smooth. Over time these accelerations tend to cancel out, yet I'm still moving.
The only way you're going to be able to calculate movement with accelerometers is to start with a known velocity and then calculate subsequent velocities and displacements from there using the accelerometer measurements (known in the navigation world as dead reckoning).
Long story short: movement in a straight line at a constant velocity has no acceleration (or force) component and thus cannot be detected or measured by an accelerometer.
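To make the dead-reckoning idea concrete, here is a minimal sketch of the double integration involved, assuming the device starts at rest and gravity has already been removed from the samples (in practice, sensor noise makes the position estimate drift badly within seconds):
// Integrate acceleration -> velocity -> position with simple Euler steps.
// a[] is a gravity-free acceleration sample; dt is the time since the last one.
float[] velocity = new float[3]; // starts at zero: device assumed at rest
float[] position = new float[3];

void integrate(float[] a, float dt) {
    for (int i = 0; i < 3; i++) {
        velocity[i] += a[i] * dt;        // m/s
        position[i] += velocity[i] * dt; // metres; error grows with t^2
    }
}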
fyp said:
Hi, I'm trying to make use of the sensors built into an Android phone to detect the acceleration of the person carrying the phone. The problem is that the sensors are very sensitive to even a small motion, say tilting the phone. What I want is that no matter where and how the phone is placed on the body (e.g. whatever the phone's orientation), the phone is always able to show me the accurate acceleration of the person...
So does anybody have any idea??? Thank you very much!
PS. I managed to cancel out the G force on the x, y and z axes, and theoretically the only acceleration left should be that of the movement. But it didn't work that way, because there were still acceleration fluctuations on the x, y and z axes whenever I touched the phone...
Doesn't this whole idea go against the very basic laws of physics? You can't tell the difference between the acceleration of gravity and the acceleration of the person moving forward. The best you could possibly do is wait for the person to put the phone wherever they're going to keep it, make absolutely sure, somehow, that it doesn't move at all, and use that as a zero point. The basic point is that it's physically impossible to tell gravity apart from any other uniform acceleration. I'm fairly sure that's the equivalence principle from Einstein's theory of general relativity (though it's been a while).
[email protected] said:
Doesn't this whole idea go against the very basic laws of physics? You can't tell the difference between the acceleration of gravity and the acceleration of the person moving forward. The best you could possibly do is wait for the person to put the phone wherever they're going to keep it, make absolutely sure, somehow, that it doesn't move at all, and use that as a zero point. The basic point is that it's physically impossible to tell gravity apart from any other uniform acceleration. I'm fairly sure that's the equivalence principle from Einstein's theory of general relativity (though it's been a while).
Thank you! Yes, you are absolutely right. That's why I tried to eliminate G first: Android defines 0, 0, g as the acceleration on x, y, z when the phone lies flat on a surface, so when the phone is moved or rotated, g is automatically decomposed onto the x, y and z axes, and I have to cancel those components out. Theoretically, G could be eliminated by working out the values of the decomposed G on x, y and z and then reversing the computation, so that the only acceleration left is that of the movement. But there is a new problem: to work out the decomposed G on x, y and z, I need the orientation of the phone, specifically the pitch and roll. It seems that Android cannot report the acceleration values and the orientation values for the same instant in time, so the calculation of the decomposed G on x, y and z ends up using the wrong pitch and roll. Also, it looks like the orientation readings are themselves derived from the acceleration... which may also corrupt the calculation of the decomposed G.
Here are the formulas I'm using to decompose G and then eliminate it on x, y and z:
gx = (float) (GRAVITY_EARTH * Math.sin(roll*DEGREE_CONV));
gy = (float) (GRAVITY_EARTH * Math.sin(pitch*DEGREE_CONV));
gz = (float) (GRAVITY_EARTH * (Math.cos(pitch*DEGREE_CONV)*Math.cos(roll*DEGREE_CONV)));
ax = r_ax - gx;
ay = r_ay - gy;
az = r_az + gz;
where DEGREE_CONV converts degrees to radians, r_ax, r_ay, r_az are the acceleration values coming directly from the SensorListener, and ax, ay, az are the net accelerations, which are supposed to be 0 when the phone is stationary.
Am I doing this correctly? Besides the problem that roll and pitch are not obtained at the same time as r_ax, r_ay, r_az, I'm also wondering whether the phone itself decomposes G using exactly these formulas; if not, I have to find out how the phone does the decomposition in order to reverse the process.
PS. Honestly, Android keeping G in the readings by default is really annoying...
sjbayer3 said:
The major issue is that the Google phone doesn't detect rotation.
Yes, you can get XYZ acceleration values, but part of that "value" is from rotation and the other part is from actual linear acceleration.
Thank you. Are you saying that the orientation/rotation values are not detected directly, but computed from the acceleration values? That's just what I'm thinking too. It would explain why the SensorListener always reports new acceleration values first and, a short time later, new orientations; and also why the orientation sometimes changes even when it hasn't really changed, just because there was a big change in acceleration (for example, if you suddenly move the phone very fast horizontally, the screen orientation may flip).
So is there any way to get the acceleration and its corresponding orientation (pitch and roll) for the same point in time?
Sicarius128 said:
The first thing we need to figure out is what exactly you are trying to measure. If you're trying to measure movement using accelerometers, you may be disappointed.
Remember that accelerometers measure changes in velocity, not changes in position. So, let's say I'm walking across a room at a constant speed and start taking measurements. I'll see accelerations up and down, because I keep alternating between moving up and moving down; likewise for the left-right measurements, and even front to back, as our gait is not perfectly smooth. Over time these accelerations tend to cancel out, yet I'm still moving.
The only way you're going to be able to calculate movement with accelerometers is to start with a known velocity and then calculate subsequent velocities and displacements from there using the accelerometer measurements (known in the navigation world as dead reckoning).
Long story short: movement in a straight line at a constant velocity has no acceleration (or force) component and thus cannot be detected or measured by an accelerometer.
Thank you for your answer. I'm aware of the cumulative error involved in dead reckoning, but for now the error is acceptable to me, as long as the displacement obtained by integrating the acceleration twice shows the general direction of the phone's movement. The phone will always start the calculation from a stationary state, meaning the initial velocity is zero. As for movement at constant speed, I think the displacement can theoretically still be computed, because the travel time and the velocity (calculated from the preceding acceleration phase) are both known; although no acceleration is detected in this case, the displacement is simply time multiplied by velocity.
fyp said:
Thank you! Yes, you are absolutely right. That's why I tried to eliminate G first: Android defines 0, 0, g as the acceleration on x, y, z when the phone lies flat on a surface, so when the phone is moved or rotated, g is automatically decomposed onto the x, y and z axes, and I have to cancel those components out. Theoretically, G could be eliminated by working out the values of the decomposed G on x, y and z and then reversing the computation, so that the only acceleration left is that of the movement. But there is a new problem: to work out the decomposed G on x, y and z, I need the orientation of the phone, specifically the pitch and roll. It seems that Android cannot report the acceleration values and the orientation values for the same instant in time, so the calculation of the decomposed G on x, y and z ends up using the wrong pitch and roll. Also, it looks like the orientation readings are themselves derived from the acceleration... which may also corrupt the calculation of the decomposed G.
Here are the formulas I'm using to decompose G and then eliminate it on x, y and z:
gx = (float) (GRAVITY_EARTH * Math.sin(roll*DEGREE_CONV));
gy = (float) (GRAVITY_EARTH * Math.sin(pitch*DEGREE_CONV));
gz = (float) (GRAVITY_EARTH * (Math.cos(pitch*DEGREE_CONV)*Math.cos(roll*DEGREE_CONV)));
ax = r_ax - gx;
ay = r_ay - gy;
az = r_az + gz;
where DEGREE_CONV converts degrees to radians, r_ax, r_ay, r_az are the acceleration values coming directly from the SensorListener, and ax, ay, az are the net accelerations, which are supposed to be 0 when the phone is stationary.
Am I doing this correctly? Besides the problem that roll and pitch are not obtained at the same time as r_ax, r_ay, r_az, I'm also wondering whether the phone itself decomposes G using exactly these formulas; if not, I have to find out how the phone does the decomposition in order to reverse the process.
PS. Honestly, Android keeping G in the readings by default is really annoying...
Hi,
Your gx and gy are almost right, but gz is wrong.
Actually, gz is quite easy to calculate once we have gx and gy. Just keep the following equation in mind:
gx^2 + gy^2 + gz^2 = G^2
Thus,
abs(gz) = sqrt(G^2 - gx^2 - gy^2)
Be careful about the signs of gx, gy and gz. gz's sign can be determined from the pitch value: if abs(pitch) > 90, gz should be negative; otherwise it is positive. BTW, I think gy's sign is wrong in your equation.
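Putting that correction into code, a hedged sketch of the fixed decomposition, reusing the earlier post's names (GRAVITY_EARTH, DEGREE_CONV, roll, pitch, r_ax, r_ay, r_az); the flipped gy sign and the pitch-based sign of gz follow the correction above, and the exact conventions should still be verified on a real device:
// Corrected decomposition of gravity onto the device axes.
// roll and pitch are in degrees, as reported by the orientation sensor.
float gx = (float) (GRAVITY_EARTH * Math.sin(roll * DEGREE_CONV));
float gy = (float) (-GRAVITY_EARTH * Math.sin(pitch * DEGREE_CONV)); // sign flipped per the reply
float gz = (float) Math.sqrt(GRAVITY_EARTH * GRAVITY_EARTH - gx * gx - gy * gy);
if (Math.abs(pitch) > 90) {
    gz = -gz; // face-down: the z component of gravity reverses
}
// With gz defined positive when the phone lies face up, subtracting each
// component should leave roughly zero net acceleration at rest (assumption:
// verify the z sign convention on-device, since the original post added gz).
float ax = r_ax - gx;
float ay = r_ay - gy;
float az = r_az - gz;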

[Q] Governor app that can set profile for "text input active"?

Is there any speed-governor app for the Xoom that can be configured to lock the CPU to 1000MHz whenever the soft input area is active (or better yet, whenever Graffiti input is active), and/or a way to increase the digitizer sample rate?
Historically, Graffiti has been totally unusable on my Xoom: the sample rate was so low, and the errors so frequent, that I literally couldn't use it. I finally got around to unlocking and reflashing my Xoom to CM10 last night, and locking the CPU to 1000MHz makes it work a lot better... but the accuracy is still a cruel joke compared to even my creaky old Hero overclocked to 711MHz.
It's pretty sad, actually. On the Hero, the digitizer seems to report samples at least 4-16 times as often, and I can get nearly 100% accuracy without even trying. On the Xoom locked to max speed, it does a tiny bit better than my S3 does with stock, but the sample rate still appears absurdly low compared to the Hero's, and the feedback lags the actual touch by at least 100-200ms. On the Hero, feedback was literally instant: stroke, and see the pixels turn white INSTANTLY under my fingertip. On the Xoom (locked to max), they start turning white a fraction of a second after I touch the screen, and I can see the last bit of the stroke render a fraction of a second after I lift my finger away. With the stock Xoom ROM, it was more like, "draw the character, and see a jagged impression of it sputter into existence about half a second later... maybe, MAYBE even getting recognized correctly about 70% of the time".
I'm guessing that either the Xoom's digitizer has a limited sample rate, or something in the kernel or driver is limiting it... but I'm still trying to find a straight answer about whether and how you can build a custom kernel without losing the ability to run paid Market apps. Or whether it's even necessary to go to that extreme, as opposed to something like a setting that tells Android to raise the sample rate, or not to throttle the CPU while an input area is active, or a way to let something like SetCPU treat "soft input area active" as a profile-triggering condition. I'm also fairly sure that the Xoom's kernel (if not recent versions of Android itself) treats the existence of a soft input area as an excuse to massively throttle the CPU, on the theory that it's just displaying a picture of a keyboard and waiting for a blunt press. HOWEVER, I'm SURE there HAS to be an equally official way of defeating that behavior, if only because it would otherwise break Android's handling of East Asian input methods.
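For anyone experimenting along these lines, the usual brute-force trick on a rooted device is to raise the cpufreq floor via sysfs. A hedged sketch from an app's side; the path and the kHz units are typical for kernels of that era but vary by device, so treat them as assumptions:
// Pin the minimum CPU frequency to ~1 GHz via the cpufreq sysfs interface.
// Requires root; needs java.io.DataOutputStream and exception imports.
try {
    Process su = Runtime.getRuntime().exec("su");
    DataOutputStream out = new DataOutputStream(su.getOutputStream());
    // 1000000 kHz = 1.0 GHz; path/units are kernel-dependent assumptions.
    out.writeBytes("echo 1000000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq\n");
    out.writeBytes("exit\n");
    out.flush();
    su.waitFor();
} catch (IOException | InterruptedException e) {
    // No root, or the sysfs path does not exist on this kernel.
}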

Improve one handed usage on smartphones.

So, with all the sensors and biometric measures in our smartphones, many of them more gimmicky than useful, paired with our constant demand for big screens, how about improving the one-handed usability of our future devices?
I mean, for starters, app menus aren't properly designed for one-handed usage. Many require you to swipe from the left edge of the screen to the right to bring out a menu located on the left of the screen, laid out from top to bottom.
Now try reaching the top-left corner menu of your handy smartphone without holding the phone in an uncomfortable, risky way (risky because you might drop it) to reach the option you want, be it the inbox, recent messages or composing an email; all of those options usually sit in the top-left corner.
I know there are some solutions, some rather crude, like the iPhone Plus' approach of shrinking the big screen into a small one, or the more common one-handed keyboard found on almost every Android phone.
But in my experience these solutions are not optimal: for example, on my Xperia I have to go through about three taps to turn the keyboard into the smaller one-handed variant. Not great for quick access.
Why don't menu lists start from the bottom and grow upward, so they're within reach of the available hand and thumb? Why don't they design adaptive apps?
And that's where sensors come in: how come our smartphones don't have sensors that detect which hand we are holding the phone with?
Left or right handed, the idea is to accommodate menus and tools within reach of your thumb, and have the interface of our apps adapt quickly and automatically to the hand holding the phone, relying on sensor information - be it the keyboard, the email app, the camera, or whatever your mind can imagine.
Because let's be real: sometimes we quickly pull out the phone with whichever hand is free on the go, and I think seamless access is needed in our fast everyday lives.
What do you think of this idea? Is it remotely doable? I'm no developer, but everyday usage creates needs that I think are easily covered with current tech. I want to hear your thoughts.
I have worked with Android sensors, but I cannot imagine what kind of sensor the phone would need to detect whether you are holding it with the left or the right hand. Somehow I imagine such a sensor would not be difficult to make, though.
Someday I want to be a designer solving these kinds of problems and improving user experiences, even by small amounts.
I think a touch sensor on the bezels would be enough.
DrKrFfXx said:
I think a touch sensor on the bezels would be enough.
I also envisioned something like that at first. Now, however, I'm thinking that maybe the proximity sensor could be used to recognize the thumb.
Shouldn't be too hard to detect which hand you're using based on touches on the touchscreen when you scroll.
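As a rough illustration of that idea, a hedged sketch of a touch-position heuristic inside a View subclass; the 0.5 split and the assumption that a right thumb mostly lands on the right half of the screen are untested guesses, not a validated classifier:
// Hypothetical heuristic: guess the holding hand from where touches land.
private float touchSum = 0f;
private int touchCount = 0;

@Override
public boolean onTouchEvent(MotionEvent ev) {
    if (ev.getAction() == MotionEvent.ACTION_DOWN) {
        touchSum += ev.getX() / getWidth(); // 0 = left edge, 1 = right edge
        touchCount++;
        boolean probablyRightHand = (touchSum / touchCount) > 0.5f;
    }
    return super.onTouchEvent(ev);
}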

Intelligent handling of touchscreen input?

Hello! My Android 7 device may be called a smartphone, but it is incredibly dumb when it comes to interpreting the input that reaches it from the screen when I touch it! I'm just wondering whether there are any apps or developer projects that try to address this. No doubt it is partly a hardware issue; maybe some devices have better sensitivity, or let you tweak more settings than just the mysterious 'pointer speed'. But beyond this, are there any attempts to augment whatever module translates physical screen data into logical clicks, drags and so on? For instance, through active or passive training to distinguish drags from clicks, or by examining the screen pixels near where I click, so that a click in the vicinity of a text label is treated as an attempt to press the label, rather than assuming I am pressing an empty area of the background for no apparent reason? Thanks in advance for any ideas! — Joseph
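Within a single app, at least, the click-versus-drag distinction is exactly what the framework's GestureDetector does; a minimal sketch of wiring one into a View (app-level only, so it cannot retrain the system-wide input pipeline the question asks about):
// Distinguish taps from drags using the framework's GestureDetector.
GestureDetector detector = new GestureDetector(context,
        new GestureDetector.SimpleOnGestureListener() {
    @Override public boolean onSingleTapUp(MotionEvent e) {
        // Treated as a click.
        return true;
    }
    @Override public boolean onScroll(MotionEvent down, MotionEvent move,
                                      float dx, float dy) {
        // Treated as a drag.
        return true;
    }
});
view.setOnTouchListener((v, event) -> detector.onTouchEvent(event));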

Remapping The Heart Rate Sensor

Back story:
Recently I bought some dirt-cheap VR goggles. They suck, but for the time being they're all I have.
I've noticed that having touch control over the phone is pretty useful: pausing a video, recentering, selecting, etc.
My goggles are basically wide open at the back, meaning the camera, jack, NFC (probably), USB and the heart-rate sensor are not blocked.
With that in mind, I remembered that feature where you can take selfies with the heart-rate sensor and thought to myself, "Heck, this ought to be easy!"
Let's start:
I need to use the heart-rate sensor's proximity detection to create a touch action in the middle of my screen.
I first headed to the "Automate" app, as it is quite flexible and simple (yet annoying) to use. To no avail: the sensor was not listed.
Tasker it was, then!
It turns out Tasker has a %HEART variable, but I am uncertain whether it's even remotely close to what I need. (tasker.joaoapps. com/userguide/en/variables.html)
Also, one of my problems is that I don't even know how to activate the sensor in the first place (see the sketch after the list below).
I don't know anything about mobile development, but here's what I know:
- Samsung offers the Sensor Extension for developers, but you need to register to ask for the dev kit via email, and apparently they don't respond very often. (developer.samsung. com/galaxy/sensor-extension)
- You can activate and monitor the sensors using the Samsung test menu, by entering *#0*# in the phone/call keypad and going to Sensor.
- Technically, for my purpose I would only want the proximity part of the heart-rate sensor, but I believe they come as a package.
- I could attempt using NFC or Bluetooth, but I'd prefer not to carry something in my hand.
- I am somewhere between noob and alright when it comes to programming C++, so I can at least understand code.
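For the activation question, the stock Android API does expose the heart-rate sensor to ordinary apps, which may be enough without Samsung's SDK. A hedged sketch of using it as a crude touch detector: the raw proximity state is not in the public API, so the accuracy callback's no-contact status is used as a stand-in, and whether it updates fast enough for VR-style taps is untested.
// Requires android.permission.BODY_SENSORS (a runtime permission on 6.0+).
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor hr = sm.getDefaultSensor(Sensor.TYPE_HEART_RATE);
sm.registerListener(new SensorEventListener() {
    public void onSensorChanged(SensorEvent event) {
        float bpm = event.values[0]; // heart rate in beats per minute
    }
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // NO_CONTACT means nothing is on the sensor; any other status
        // suggests a fingertip is covering it - a crude "button press".
        boolean fingerPresent = accuracy != SensorManager.SENSOR_STATUS_NO_CONTACT;
    }
}, hr, SensorManager.SENSOR_DELAY_FASTEST);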
I'm asking whether someone can shed some light on whether this has been done before; I might be missing some obvious answer or another third-party app.
I don't want to ask for a lot, to be honest.
I guess other solutions would be much easier anyway; I just like breaking my head thinking about other ways to do things.
Thx.
