
Sencity 2.0: Sending Our Smart City Platform Out for Another Spin

Sencity, an ongoing side project, first launched to coincide with the LoRa Alliance Standards Meeting in Philly in June 2017. Version 2.0 went out for a spin during DesignPhiladelphia 2017 and the Smart Cities Readiness Workshop in October. The platform is currently under construction.  

Sencity is a platform that helps locals and tourists navigate some of Philadelphia’s most well-known sites without getting caught in crowds.

Small IoT devices equipped with multiple sensors track crowd levels at Franklin Fountain, the PHS Pop-Up Garden, Betsy Ross House, the Museum of the American Revolution, and, of course, Pat’s and Geno’s, to provide guidance on the best times to visit. The data is fed to a responsive desktop application.

How can we harness smart city technology to improve the lives of those living in and visiting Philadelphia?

Sencity is an internal project meant to provide a solution to a common problem while showcasing the elegance of the LoRa network. Smart city applications are about humans, not cities — we asked ourselves, how can we harness smart city technology to improve the lives of those living in and visiting Philadelphia?

We launched a beta version of Sencity last spring and used what we learned to launch an improved version earlier this month. Read about the origins of the Sencity project in our Introducing Sencity post, and access Sencity online by going to the Sencity dashboard.

What’s Different About Sencity 2.0?

Like all Bresslergroup side projects, Sencity is an opportunity to learn and experiment by developing a device that overlaps with client work. The number of LoRa-based devices we take on continues to grow. “We’re finding that more people are asking about LoRa,” says Todd Zielinski, Senior Director of Electrical Engineering. “They come in knowing about LoRa rather than needing to be introduced to it.”

I sat down with a few of the electrical engineers who worked on this latest iteration (Todd Zielinski, Ian Adam, Jim Wise, and Howard Cohen) to ask about improvements and takeaways:

Q: What did you change from v1 to v2, and why?

A: The biggest changes were for power management. We initially created a printed circuit board with a large array of sensors because we knew we’d want to experiment with different configurations, but that made it tricky to drive down the system’s current consumption.

We found that the time-of-flight sensor we had been using was effective for counting pedestrians, but it was too power-hungry for this application. Ideally you want a smart-city application to last for days, not hours. So instead we’re using a low-power audio sensor to estimate activity based on ambient noise, which has increased run time by a factor of 5.
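
To see why the sensor swap buys so much run time, here is a back-of-the-envelope battery-life calculation, sketched in Go since that’s the language we used elsewhere on the project. The capacity and current figures are hypothetical placeholders, not measurements from our hardware:

```go
package main

import "fmt"

// Rough battery-life estimate: hours of run time = capacity (mAh) / average draw (mA).
func runtimeHours(capacityMAh, avgDrawMA float64) float64 {
	return capacityMAh / avgDrawMA
}

func main() {
	const batteryMAh = 2000.0 // hypothetical cell capacity

	tofDrawMA := 20.0 // hypothetical average draw with the time-of-flight sensor
	micDrawMA := 4.0  // hypothetical average draw with the low-power audio sensor

	fmt.Printf("time-of-flight: ~%.0f hours\n", runtimeHours(batteryMAh, tofDrawMA)) // ~100 hours
	fmt.Printf("audio sensor:   ~%.0f hours\n", runtimeHours(batteryMAh, micDrawMA)) // ~500 hours, about 5x
}
```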

We also optimized the data transfer — the device is now reporting every 15 minutes as opposed to every 2 minutes. We’re using a GPS module to collect location data and show where the sensor is located on the map. (This came in handy when one of our bikes disappeared and we were able to track it to Camden.)
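
As a sketch of what the slower reporting cadence looks like in practice, here is a hypothetical model of the uplink loop, again in Go for illustration. The payload fields, device ID, and coordinates are made up, and the real device firmware doesn’t run Go; this just models the logic:

```go
package main

import (
	"fmt"
	"time"
)

// Report models one uplink: an activity estimate from the audio sensor
// plus the GPS fix used to place the sensor on the dashboard map.
type Report struct {
	DeviceID  string
	Activity  float64 // 0.0-1.0 crowd-level estimate derived from ambient noise
	Lat, Lon  float64 // from the on-board GPS module
	Timestamp time.Time
}

func main() {
	// Reporting every 15 minutes instead of every 2 means far fewer
	// radio transmissions, which is where much of the power goes.
	ticker := time.NewTicker(15 * time.Minute)
	defer ticker.Stop()

	for t := range ticker.C {
		r := Report{
			DeviceID:  "sencity-01", // hypothetical ID
			Activity:  0.42,         // placeholder reading
			Lat:       39.9496,      // placeholder coordinates
			Lon:       -75.1503,
			Timestamp: t,
		}
		fmt.Printf("uplink: %+v\n", r) // stand-in for the LoRa transmit
	}
}
```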

Q: Any other learnings from the first iteration?

A: In the first round, some of the sites told us they were interested in the data; others had concerns about it being out there. It raised interesting questions: What are the pros and cons of data transparency for different types of businesses? Which data sets do you expose? How do you use them? There’s a certain amount of fear around some data sets that needs to be overcome.

In general, Philadelphia has better cooperation across city government departments than New York City does. OpenDataPhilly is a great project that merges data across domains. If we merged the data we’re tracking with Sencity, what sorts of correlations would we see?

We also learned that a lot of historic sites don’t have nearby bike racks.

Q: And from working on this improved version?

A: We learned more about the interaction between the casework and the microphone, and how important casework design is for microphone sensitivity. Pulse-density modulation (PDM) microphones are sensitive to dust and moisture intrusion, so you want an exposed port on your device with the mic positioned as close to that port as possible, ideally separated only by an airtight gasket. We used a Gore-Tex membrane that connects the interior to the exterior, letting air flow through while blocking moisture and dirt.

And we continue to learn things that are going to help us help clients drive prices down and performance up.

Q: What has this project given you a chance to experiment with?

A: Antenna tuning, PDM microphones, accelerometers, power management. We also worked with Go, a programming language we’d never used before, to accomplish full-stack web development in-house. It helped us get the back end up and running quickly.
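
For a sense of why Go made the back end quick to stand up, here is a minimal sketch of the kind of endpoint a dashboard could poll for the latest readings, using only the standard library. The route, struct, and sample values are assumptions for illustration, not Sencity’s actual API:

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Reading is an illustrative record the dashboard might poll for.
type Reading struct {
	Site     string  `json:"site"`
	Activity float64 `json:"activity"` // 0.0-1.0 crowd-level estimate
}

func main() {
	// Hypothetical endpoint and sample data, for illustration only.
	http.HandleFunc("/api/readings", func(w http.ResponseWriter, r *http.Request) {
		latest := []Reading{
			{Site: "Betsy Ross House", Activity: 0.35},
			{Site: "Museum of the American Revolution", Activity: 0.60},
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(latest)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```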

Q: What will you change in the next round?

A: There are more things we can do with this package, especially because it houses a powerful but energy-efficient ARM Cortex-M4 processor, and there are a number of sensors on the board that we haven’t integrated yet. We’re thinking about water metering, cold-chain tracking, and other remote-sensing applications. We hope this is an ongoing process for us.

Need help developing a connected product? Read about our IoT product design expertise.