My roommate installed a home security system called Canary. Once you install its app on your phone, the phone serves as a presence sensor and knows whether you are home or not. It timestamps when you enter and leave the house.
When both my roommate and I are out of the house, the device is in “Armed Mode,” which lets us set off a siren alarm. When one of us is home, it is in “Disarmed Mode.”
It is currently sitting on a table in front of the front door, and I can watch a live feed from my phone. It uses an HD camera for vision and sound. Whenever motion is detected, it records about 10 minutes of footage (both video and sound), which can be accessed from the timeline in my phone app. Yes, it is creepy.
The Canary device also senses temperature, humidity, and air quality.
My phone is busily buzzing today because I am getting notifications from the SmartThings door sensor whenever the lab room door is opened or closed, and from Canary whenever motion is detected in the house.
I find it interesting that there is such a huge market for home-security camera systems. About a week ago, when I was in Boston, there were so many posters for Nest’s camera in the subway stations. What is it that drives people to buy cameras to watch their homes, the most intimate and private place of all? Is privacy an inevitable price you have to pay for security?
Anyhow, I am being tracked everywhere: in the lab room by two Kinects, multisensors, and a door sensor, and at home by Canary.
After spending a few days with the new gadgets the following facts emerged:
- The Lenovo laptop has Windows 7 pre-installed, while Kinect SDK 2.0 requires Windows 8. Though the laptop came with a Windows 8 recovery CD, we were unable to get Windows 8 working on it, which makes it of limited use for development purposes.
- The new Surface Pro 2 we got has severe Visual Studio installation issues. Its disk also seems to be overflowing: as of this writing it has become unusable, since only 15 MB of hard disk space is left (our Skype meeting with Blase was temporarily cut off due to this issue).
- The HP tablet we ordered turned out to be our best purchase. It has Windows 8 out of the box, a decent touchscreen, and a great ergonomic keyboard. It’s perfect for deployment and development; currently HomeOS, the Z-wave stick, and the Kinect are all running on it simultaneously without a hitch so far (touch wood).
In conclusion, we’ll probably need to return the Lenovo laptop and the Surface Pro 2, and order more HP tablets in their place.
Salient points from our meeting with Blase:
- Print out comment cards and run a pilot study in the lab with Jenny and me as guinea pigs.
- Make a crash course for Anavi (which can later be reused for new UPOD members) to get her up to speed on the basics, so that she can develop a clean web-based interface for controlling lights and even answering survey questions.
- Fix Helios’s Hue light logging so that it only happens when the light state changes. Like all things Helios, this won’t be as straightforward as it should be, and will require a foray into Python’s DB modelling and the Hue API. Currently all data models, including lights, inherit from a single class, so changing the save behaviour for only the lights will require an override, which in turn means looking into SQLAlchemy’s ORM.
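The save-on-change idea from that last bullet can be sketched in plain Python, independent of Helios’s actual SQLAlchemy models. The class and method names below are hypothetical stand-ins, not Helios’s real code:

```python
# Minimal sketch of "log only when the light state changes".
# HueLightLog and its fields are hypothetical; in Helios the real
# version would override the save behaviour inherited from the
# shared SQLAlchemy base class.

class HueLightLog:
    """Append-only log that skips writes when the state is unchanged."""

    def __init__(self):
        self._last_state = None
        self.rows = []  # stand-in for the database table

    def save(self, state):
        # Override point: only persist when the state actually changed.
        if state == self._last_state:
            return False  # no-op: avoids duplicate rows
        self._last_state = state
        self.rows.append(state)
        return True

log = HueLightLog()
log.save({"on": True, "brightness": 254})
log.save({"on": True, "brightness": 254})  # duplicate, skipped
log.save({"on": False, "brightness": 0})
print(len(log.rows))  # 2
```

The point is that repeated polls of an unchanged light produce no new rows, so the log only grows on actual transitions.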
Finally we have good news to share today!
Let’s start off with a bad one: we have decided to put the blue Surface Pro 3 in a drawer because it seems to be faulty and malfunctions in many ways. Even after uninstalling the old Kinect SDK and reinstalling the new one, it still did not recognize the Kinect and freaked out when the USB cable was plugged in.
For the Surface Pro 2 and the HP tablet, we figured out a way to install the Z-wave driver on Windows. Extracting the installer before running it was the key.
Then magic happened and everything else worked too. Given an issue raised on Microsoft’s website about the Kinect needing to be the only device connected to a USB controller, we had assumed the Kinect kept dying when plugged into a USB hub alongside the Z-wave stick because the Kinect is resource-hungry. However, it turns out they can coexist! It must have been a problem with the blue Surface itself.
We have a package ready for Professor Littman to take home and install. The box includes a Surface Pro 2 that successfully runs HomeOS and sends both Kinect and multisensor data to Helios. We also wrote up documentation for self-deployment, so that when we send a package to Blase, he can set things up himself as well.
Currently, a multisensor and a Kinect connected to the HP tablet are logging data in the lab room.
Tomorrow we will work on preparing the package for Blase, debugging Helios, and writing the door sensor model.
Helios dev is back up! The supervisor script had been stopped and just needed to be restarted (followed by restarting Helios and reloading nginx).
With Helios back up, multisensor data is logging (from the Z-wave stick connected to the blue Surface). However, the blue Surface no longer seems to recognize the Kinect (even when it is plugged in directly), which is problematic to say the least. To get around this we’re using the black Surface (the Surface 2) to log Kinect data. So the LabRoom now has multisensor, Kinect, and light data being logged! (One caveat: the Hue lights are not working on the dev version, though they are on production, but hopefully with the ports open this will be easy to debug.)
We’re also setting up all the new devices we got today; the Wi-Fi is clogging up a bit, but the HP tablet and Lenovo laptop should be ready by tomorrow (which will let us test the Kinect and Z-wave stick on the same machine).
With Helios and HomeOS both working again, we can turn our attention to SmartThings. Jenny got hold of Professor Kraska’s SmartThings details, and we’re thinking of writing a door-sensor-to-Helios app for SmartThings as one of the main targets for the week (as that will open up a large number of devices for logging and controlling).
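The shape of that door-sensor-to-Helios bridge might look like the sketch below. The endpoint URL and field names are assumptions, not Helios’s real API (and the actual SmartThings side would be a SmartApp subscribing to contact-sensor events and POSTing each one out); this just shows the event-to-payload step:

```python
import json
import time

# Hypothetical payload builder for a door-sensor-to-Helios bridge.
# The endpoint and JSON field names below are made up for illustration.
HELIOS_URL = "https://upod-dev.cs.brown.edu/api/door"  # hypothetical endpoint

def door_event_payload(sensor_id, state, timestamp=None):
    """Turn a door open/close event into a JSON body to POST to Helios."""
    if state not in ("open", "closed"):
        raise ValueError("state must be 'open' or 'closed'")
    return json.dumps({
        "sensor": sensor_id,
        "state": state,
        "timestamp": timestamp or time.time(),
    })

body = door_event_payload("lab-room-door", "open", timestamp=1430000000)
print(body)
```

Each open/close event becomes one small JSON record, which is what makes the door sensor data look like the multisensor and Kinect streams Helios already logs.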
Oh, and we also need to finish reading the API book, perhaps next week.
We are halfway done. I am a bit wary of using an exclamation point: we have gotten some things to work, but we still have a long way to go.
Several small bugs in Helios were fixed (related to the multisensor view). Jenny and I had talked about adding an auto-device-addition code generator to Helios, but the idea was squashed given the time required versus the low number of further devices we intend to add (the automation could take up to 20 hours, while adding a new device to Helios by hand takes 1–2 hours).
We also now have two stable test websites for Helios: upod.cs.brown.edu and upod-dev.cs.brown.edu. The dev version is for testing experimental features and lets us try functionality that’s potentially broken.
Friday deployment at Professor Littman’s house nears…
Good to be back! I have decided to become a terse writer, mostly because I am typing on a Surface Pro right now. Here are the updates: