LAMP Consortium

Timestamp / UTC time?

Hi!

I am trying to run the following code in Python and running into an issue: the code seems to work fine, but it is not actually saving any CSV files.

for participant in LAMP.Participant.all_by_study('SHORE40')['data']:
    data = []
    events = LAMP.SensorEvent.all_by_participant(participant['U1507533520'], origin='lamp.gps.contextual')['data']
    for event in events:
        data.append({
            'timestamp': event['timestamp'],
            'UTC time': "",
            'latitude': event['data']['latitude'],
            'longitude': event['data']['longitude'],
            'altitude': 1.0,
            'accuracy': 1.0
        })
    # Don't make CSV files for participants without any lamp.gps.contextual events.
    if len(data) > 0:
        pd.DataFrame.from_dict(data, orient='columns').to_csv(f"{participant['U1507533520']}.csv", index=False)

Do I need to input a timestamp? If so, how would I convert the study's start and end date/time to a timestamp format?

Ideally, I am trying to make a CSV for each participant with their sensor data, which we can then collate into one CSV for analysis.

Thank you in advance!

If you're trying to pull all available sensor data for a given participant, you don't need to input a timestamp. By default, LAMP.SensorEvent.all_by_participant returns the 1000 most recent sensor events for a given participant. The number of events returned can be customized using the _limit parameter (e.g. LAMP.SensorEvent.all_by_participant('U1507533520', _limit=200, origin='lamp.gps.contextual') will return the 200 most recent gps.contextual reads). If _limit is negative, it will instead return the n=|_limit| oldest (least recent) events.

So if you input a _limit that is larger than the total number of sensor events your participant has (either across all sensors or for one sensor, depending on whether you specify origin), it will return all of their data.
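To answer the timestamp question directly: sensor events carry millisecond-scale `timestamp` values, which suggests LAMP timestamps are Unix epoch milliseconds (my assumption here). If you do later want to bound a query by your study's start and end dates, a conversion sketch could look like this (the dates below are placeholders):

```python
from datetime import datetime, timezone

def to_lamp_ts(year, month, day, hour=0, minute=0):
    """Convert a UTC date/time to a Unix epoch-millisecond timestamp."""
    dt = datetime(year, month, day, hour, minute, tzinfo=timezone.utc)
    return int(dt.timestamp() * 1000)

start = to_lamp_ts(2021, 6, 1)   # hypothetical study start
end = to_lamp_ts(2021, 9, 30)    # hypothetical study end
```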

For the high-density sensors (lamp.accelerometer and lamp.gps, specifically), I recommend using the cortex.raw methods in the LAMP-cortex package, since trying to query millions of events with one API call may cause server timeout issues. These methods query sensor data by iteratively calling LAMP.SensorEvent.all_by_participant until all data has been returned. You could also implement this functionality yourself, in your own environment.
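A minimal sketch of that do-it-yourself loop, written around a generic `fetch_page(limit, to)` callable so it stands alone. In practice `fetch_page` would be a thin wrapper you write around LAMP.SensorEvent.all_by_participant, assuming it accepts a `to` upper bound on timestamps alongside `_limit` (check the API docs for the exact parameter names):

```python
def fetch_all_events(fetch_page, page_size=1000):
    """Collect every event by repeatedly requesting pages of newest-first
    events, using the oldest timestamp seen so far as the next upper bound.

    fetch_page(limit, to) should return up to `limit` events with
    timestamp < to (to=None meaning "most recent"), newest first.
    """
    events, to = [], None
    while True:
        batch = fetch_page(page_size, to)
        events.extend(batch)
        if len(batch) < page_size:
            return events  # short page means we've reached the oldest data
        to = batch[-1]["timestamp"]  # oldest event in this page
```

This is essentially the same loop cortex.raw runs for you, so reaching for the package first is still the easier path.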

Does this help?

P.S. In the line where you call LAMP.SensorEvent.all_by_participant, the correct syntax would be participant['id'] as opposed to participant['U1507533520'].

Yes thank you! I switched over to cortex and was able to save one participant’s screen_state data (yay!) using the following code:

cortex.run('U0624097060', ['screen_state'], start=0, end=cortex.now())['screen_state'].to_csv('~/exportaccel.csv', index=False)

However, when I try to change the participant or sensor, I keep getting the error message:

cortex.run('U0624097060', ['survey'], start=0, end=cortex.now())['survey'].to_csv('~/exportaccel.csv', index=False)
[INFO:feature_types:_wrapper2] Cortex caching directory set to: /Users/melanienisenson/.cache/cortex
[INFO:feature_types:_wrapper2] Processing raw feature "lamp.survey"...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/cortex/run.py", line 61, in run
    _res = func_list[f]['callable'](
  File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/cortex/feature_types.py", line 78, in _wrapper2
    saved['start'] = int(saved['start'])
ValueError: invalid literal for int() with base 10: 'U0624097060'

Could you help me decode this? I've tried it with a number of different participant IDs and sensor types but keep getting the same message ):

Thanks for your help!!

@rhays if you have any insight on this error I would greatly appreciate it! Thank you!

Hi @mnisenson, sorry for the late reply. This has to do with a bug in the data caching (it’s incorrectly trying to read in the user id as a timestamp). Our data team is working on this and will release a fix asap!

Ah okay, thank you! Glad to know it wasn't an issue in my code. Will you post somewhere when it is fixed? Just so I don't have to continually bother you about it (: