Syncing sensor data from local machines to the (Amazon) Cloud and vice versa

Reach out to Live Edit about the historical data?
OTB Warehouse diagram
Break down the technology used
Use AWS email sender for mail
Break down schema
Update live edit and warehouse
Look into Live Edit's API
Issues with duplicate local database
Do we trust the client to validate its input?
If the client PC fails (hard drive dies, etc.), we have no backup as of now
Performance data can’t be matched with user IDs, etc., as this data is local and we can’t query it
Attempting to sync everything requires redeveloping the existing application
Having a local cache for high demand writes is a good idea
Diagram of the database relations, including DynamoDB
I am architecting the performance type event flow to make the least amount of impact on the existing code base. Switching out a local SQL server for a remote one is a fairly trivial task, but implementing a local copy of that specific studio’s data, and syncing it with our database, will cause many undesirable edge cases. We can build around momentary drops in internet connection, but operating without internet access brings up serious concerns.
Thus far we have been discussing the modularization of our business functionality, and a large majority of it will live on the Cloud. Otherwise we would have to implement the functionality on every single client we build, and maintain each one individually.
If our studios expect to be able to provide the best technology to our members, they need to maintain an Internet connection.
Performance events will be queued on the client until they are posted to our DataStores.
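A minimal sketch of that client-side queue, assuming an in-memory FIFO buffer and a hypothetical API URL (the real endpoint is not decided yet): events are queued locally and re-queued on failure, so a momentary connection drop only delays delivery.

```python
import json
import queue
import urllib.request

# Hypothetical endpoint; the real API URL is an assumption here.
API_URL = "https://api.example.com/performance-events"

# In-memory FIFO queue of performance events awaiting upload.
pending_events = queue.Queue()

def record_event(event: dict) -> None:
    """Queue a performance event locally until it can be posted."""
    pending_events.put(event)

def flush_events() -> int:
    """Try to POST queued events; re-queue on failure so a dropped
    connection only delays delivery. Returns the number delivered."""
    delivered = 0
    while not pending_events.empty():
        event = pending_events.get()
        try:
            req = urllib.request.Request(
                API_URL,
                data=json.dumps(event).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req, timeout=5)
            delivered += 1
        except OSError:
            pending_events.put(event)  # keep it queued for the next flush
            break
    return delivered
```

A real client would persist the queue to disk so events survive an application restart; the in-memory version just shows the flow.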
Future considerations
We might end up developing a single API endpoint that the client application can post to directly, and that API service would be responsible for populating the consumers (Live Edit, DynamoDB, logging, etc.). This would simplify the client implementation significantly, as the client would only be responsible for posting its queued performance type objects to a single API.
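The fan-out described above could look something like this Lambda-style handler. The consumer functions are placeholders, not a real SDK; the point is that the client posts one batch and the service distributes each event to every downstream consumer.

```python
import json

# Placeholder consumer hooks; names are illustrative, not a real SDK.
def save_to_dynamo(event: dict) -> None:
    print("dynamo:", event["type"])

def post_to_live_edit(event: dict) -> None:
    print("live edit:", event["type"])

def log_event(event: dict) -> None:
    print("log:", event["type"])

CONSUMERS = [save_to_dynamo, post_to_live_edit, log_event]

def handler(event, context=None):
    """Single-endpoint handler: accepts a batch of queued performance
    events and fans each one out to every registered consumer."""
    events = json.loads(event["body"])
    for e in events:
        for consume in CONSUMERS:
            consume(e)
    return {"statusCode": 200, "body": json.dumps({"accepted": len(events)})}
```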
The backend services would be built out in Node.js, Java, or Python (so far the only languages that AWS Lambda supports).
At this time our client (studio app) will be responsible for posting to both of these APIs.
Ideally the local data store would act similarly to Amazon’s SQS service, a queue that works on a first-in, first-out basis. Services consume this data by polling the queue for new events, processing the input, and generating their own output (saving it to DynamoDB, a third-party service like Live Edit, etc.).
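The queue-and-poll pattern above can be sketched with a local first-in, first-out stand-in for SQS (a plain deque here, just to show the consumer side):

```python
from collections import deque

# Local stand-in for an SQS-style queue: first in, first out.
event_queue = deque()

def enqueue(event: dict) -> None:
    """Client side: append a new performance event to the tail."""
    event_queue.append(event)

def poll(max_messages: int = 10) -> list:
    """Consumer side: pull up to max_messages events off the front,
    mirroring how a service would poll SQS for a batch to process."""
    batch = []
    while event_queue and len(batch) < max_messages:
        batch.append(event_queue.popleft())
    return batch
```

The real SQS API additionally requires deleting a message after it is processed; this sketch only illustrates the ordering and batching.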
To make the best use of space, we will keep the performance type object as small as possible.
The Performance Table will look like this:
Key – Type
1 – Treadmill
2 – Rower
3 – Free Weights
4 – Bike
Note that we can create searchable indexes based on top-level attributes; all other miscellaneous data should be stored within the Array<Object> event_data property.
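An illustrative item shape under that rule; the attribute names here are assumptions, not the final schema. Top-level attributes are kept flat so they can back DynamoDB secondary indexes, while everything else is tucked into event_data:

```python
# Type codes from the Performance Table above.
EQUIPMENT_TYPES = {1: "Treadmill", 2: "Rower", 3: "Free Weights", 4: "Bike"}

# Illustrative item shape; attribute names are assumptions, not the schema.
performance_item = {
    # Top-level attributes: candidates for searchable indexes.
    "studio_id": "studio-001",
    "member_id": "member-042",
    "type": 4,  # 4 = Bike
    "timestamp": 1457000000,
    # All other misc data lives inside event_data to keep the item small.
    "event_data": [
        {"metric": "rpm", "value": 92},
        {"metric": "watts", "value": 180},
    ],
}
```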
69,120,000 requests will be created in a single day across 800 studios that each execute 4 classes.
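Breaking that daily total down (the 60-minute class length is an assumption, not stated above):

```python
total_requests = 69_120_000
studios = 800
classes_per_studio = 4

# Requests generated by a single class.
per_class = total_requests // (studios * classes_per_studio)  # 21,600

# Assuming a 60-minute class, the sustained rate per class is:
per_minute = per_class // 60  # 360
per_second = per_minute // 60  # 6
```

So each class produces roughly 21,600 requests, around 6 per second under the assumed class length, which is the kind of sustained write rate the local cache and queue are meant to absorb.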