Have any fellow students used their Android tablets to video record the lectures or seminars they attend for personal use, whilst the tablet rested on the table?
Been thinking of getting an Android tablet for that purpose.
I know it will not be ideal, nor as good as a video camera, but I am asking if anyone has had actual experience attempting to do so.
Related
Hey guys, I just found this and was wondering if anybody has had a chance to check it out on the Gtab. Looks pretty sweet. I would have tried it out and let you know, but I'm deployed right now and can't get internet on the tab. (wired only)
GTalk Video
With so many people owning a tablet nowadays, it boggles my mind that we still don't have a reliable video conferencing app for the Android platform.
Can anyone provide insight into how one might feed video into an Android tablet (or similar device)?
I'd like to play around with testing the 3D screen with different video feeds, and I'm thinking of using a tablet such as those found on AliExpress.
I don't even care about the 'computer' inside; ideally I'd interface as directly with the display as possible. I just can't find much raw data about the hardware itself.
Hi, I think this will be a bit different type of question than you guys are used to seeing. I'm hoping some of the guru developers here have a suggestion or two.
Where I work we do video/image processing, mainly for government and UAVs, stuff like that. I'm a HW engineer. Today a SW engineer came to me asking if there is a way we could tap the HW in an Android phone to get some kind of timing or sync signals that they could use to correlate data from the internal IMUs (accelerometers) with data being recorded by the video camera on the phone. I believe they are looking into using the IMU data to stabilize video captured from the camera (similar to how we do it on other systems).
I'm not at all familiar with the HW in an Android phone, but I imagine a lot of the key signals I'd need are locked up inside the CPU somewhere.
To me this seemed like something SW could do. We agreed that writing the typical high-level Java-type app for the phone wouldn't work: either too much delay, inaccurate timing, or the app would not be able to get direct access to the IMU data and the video stream to record both.
Is this something that can be done by hacking the standard video app? Would we need to get down to the kernel level to properly see/control the video camera and the IMU?
We don't work with Android so this would be very new territory to us. Let me know if you have any thoughts.
thanks
JM
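For what it's worth, on the app side this may not need kernel work at all: Camera2 already exposes a per-frame start-of-exposure timestamp (CaptureResult.SENSOR_TIMESTAMP), every SensorEvent carries a nanosecond timestamp, and on devices that report SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME both sit on the same boot-time clock. Below is a rough Kotlin sketch of pairing the two; the class name FrameImuCorrelator and the buffering choices are mine, and it assumes you already have a Camera2 capture session running that the callback can be attached to.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CaptureRequest
import android.hardware.camera2.CaptureResult
import android.hardware.camera2.TotalCaptureResult
import java.util.concurrent.ConcurrentSkipListMap

// Sketch only: pairs accelerometer samples with video frames using their
// shared nanosecond clock, instead of tapping hardware sync signals.
class FrameImuCorrelator(private val sensorManager: SensorManager) : SensorEventListener {

    // Accelerometer samples keyed by their nanosecond timestamp.
    private val imuSamples = ConcurrentSkipListMap<Long, FloatArray>()

    fun start() {
        val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_FASTEST)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // event.timestamp is nanoseconds on the sensor clock.
        imuSamples[event.timestamp] = event.values.clone()
        // Keep the buffer bounded (roughly the last second at a few hundred Hz).
        while (imuSamples.size > 400) imuSamples.pollFirstEntry()
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit

    // Attach this to your existing CameraCaptureSession.setRepeatingRequest(...).
    val captureCallback = object : CameraCaptureSession.CaptureCallback() {
        override fun onCaptureCompleted(
            session: CameraCaptureSession,
            request: CaptureRequest,
            result: TotalCaptureResult
        ) {
            val frameStartNs = result.get(CaptureResult.SENSOR_TIMESTAMP) ?: return
            // Nearest IMU sample at or before the frame's start-of-exposure time.
            val match = imuSamples.floorEntry(frameStartNs) ?: return
            val skewNs = frameStartNs - match.key
            // Hand (frameStartNs, match.value, skewNs) to the stabilization pipeline.
        }
    }
}
```

On devices where the camera timestamp source is not REALTIME you would still have to estimate a fixed offset between the two clocks, but that can be done once per session rather than per frame.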
I apologize if this is the wrong forum. I did search for the answer to my questions, but I have such a poor understanding of these concepts that it's hard for me even to research them. I ask for patience and simple explanations if possible, as my tech-savvy level tends to be pretty low.
My parents bought an LG Google TV (55GT) over the holidays, finally caving in and buying a nice TV for the first time in my life. They kind of went all out on this one, deciding that for once they just want something that fills all their needs. Sadly, it's proving to be incomprehensible on so many levels. Part of it is that we have pretty bad internet for streaming, which we knew in advance.
The basics are working for us. Netflix, Amazon, and I even got a workaround for Hulu Plus.
I began looking for media servers to try to stream media from our laptop to the TV, and discovered Plex and PlayOn, as well as Twonky and the app that led me to this forum, BubbleUPnP.
My understanding so far is that Twonky, Plex, and PlayOn are merely servers, and that I need a product like BubbleUPnP to recognize and communicate with those servers on the TV. So far so good. Using BubbleUPnP (and I also tried Avia) I was able to see all our running servers, and ultimately I was able to access the media files I had stored on my computer (after messing around a bit; not all of them worked for video, but all let me listen to my music).
Enter the problem I am working on now, having conquered all these other issues. All these servers come with great channels I can add on. There are academic channels with lectures, podcasters, tech stations, and so on. However, when I try to play these channels in BubbleUPnP, I get this error:
"bubbleup can be selected only for playing a video to an external renderer (ex: TV) from another ap (ex: file manager)"
BubbleUPnP gives me an option, on the screen where I choose which server to browse, to use a 'local renderer' or 'Avia's renderer'. I don't know what this means.
What I am hoping for is someone to explain to me what rendering is, how it works, and how I can fix this issue. My parents would like access to all these cool channels, and I would like to slay my massive frustration with it.
What is a local renderer? Do I need to download a renderer to my TV? To my computer? How? Does this have to do with video codecs, or with translating data into a language the TV can understand?
I would be grateful even to be pointed to something that explains this stuff. I hardly know where to start because my knowledge of it is so pathetic. Please help?
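In UPnP/DLNA terms there are three roles on your network: the media server (Plex, Twonky, or PlayOn on the laptop, which stores and serves the files), the renderer (whatever actually decodes and displays the video, e.g. the TV; 'local renderer' means the phone or tablet itself plays it), and the control point (BubbleUPnP, which browses the server and tells a renderer what to play). The quoted error reads as BubbleUPnP saying it will only hand a video from another app to an external renderer rather than play that channel itself. To make the 'renderer' idea concrete, here is a small Kotlin sketch that does a standard SSDP search for MediaRenderer devices on the LAN; the search string is the standard UPnP one, everything else is my own naming.

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress
import java.net.SocketTimeoutException

// Minimal SSDP search for UPnP MediaRenderer devices (the "renderer" role a
// control point like BubbleUPnP looks for). Run it on the same LAN as the TV;
// each device that answers advertises itself as something that can *play*
// media handed to it, as opposed to a MediaServer that only stores it.
fun main() {
    val searchTarget = "urn:schemas-upnp-org:device:MediaRenderer:1"
    val request = buildString {
        append("M-SEARCH * HTTP/1.1\r\n")
        append("HOST: 239.255.255.250:1900\r\n")
        append("MAN: \"ssdp:discover\"\r\n")
        append("MX: 2\r\n")
        append("ST: $searchTarget\r\n")
        append("\r\n")
    }.toByteArray()

    DatagramSocket().use { socket ->
        socket.soTimeout = 3000
        socket.send(
            DatagramPacket(request, request.size,
                InetAddress.getByName("239.255.255.250"), 1900)
        )
        val buf = ByteArray(2048)
        try {
            while (true) {
                val packet = DatagramPacket(buf, buf.size)
                socket.receive(packet)  // one response per renderer on the LAN
                println("Renderer at ${packet.address.hostAddress}:")
                println(String(packet.data, 0, packet.length))
            }
        } catch (e: SocketTimeoutException) {
            // No more responses; done.
        }
    }
}
```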
I have four of these cameras and I have them all working kinda OK, but there is no RTSP or any easy way to configure them.
My issue is that on motion detection they only record for 9 seconds. That's it.
I can have them continuously record 24/7 to my NAS, but I really don't want my NAS to be spinning its hard drive all day.
I just want them to record when, and for as long as, there is movement, and then send that file to my NAS.
It shouldn't be such a big issue, but then again it's Xiaomi...
I have SD cards installed and can access them from anywhere, which is good, but I also can't choose a folder on my NAS to record to.
Coming from Foscam, using iSpy as a server on my laptop worked so much better, but I wanted HD and needed the upgrade anyway.
If anyone has suggestions or hints about how to use them more efficiently than Mi Home, I will be very happy.
Thank you..
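Not a fix for the 9-second clips themselves, but if you can get the camera's motion clips visible in a folder that a laptop or some always-on box can see (people have done this on Xiaomi cameras with alternative firmware that exposes the SD card over FTP/SMB; I haven't tried it on these particular ones), then getting them onto the NAS is the easy part. Here is a minimal Kotlin sketch that polls a hypothetical clip folder once a minute and copies any new .mp4 files to a NAS share, so the NAS disk only spins up when there is actually something to write; both paths are placeholders.

```kotlin
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import java.nio.file.StandardCopyOption

// Placeholder paths: point these at wherever the camera's motion clips become
// visible (e.g. an exported SD-card folder) and at the NAS share to copy into.
val clipSource: Path = Paths.get("/mnt/camera-sdcard/record")
val nasTarget: Path = Paths.get("/mnt/nas/camera-clips")

fun main() {
    val seen = mutableSetOf<String>()
    Files.createDirectories(nasTarget)
    while (true) {
        Files.list(clipSource).use { clips ->
            clips.filter { it.toString().endsWith(".mp4", ignoreCase = true) }
                .forEach { clip ->
                    val name = clip.fileName.toString()
                    if (seen.add(name)) {
                        // Copy each new motion clip exactly once; the NAS disk
                        // only spins up while a file is actually being written.
                        Files.copy(clip, nasTarget.resolve(name),
                            StandardCopyOption.REPLACE_EXISTING)
                    }
                }
        }
        // Poll once a minute so a clip is normally finished before it is copied.
        Thread.sleep(60_000)
    }
}
```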