Nice. I worked on a project using SOHO imagery that would do something similar: the images would be displayed on a large screen, like the observatory on the ship from Sunshine. It was meant for a classroom at an observatory, but it just never made it; it died on the vine. It's cool to see a project in an area where I have actual experience with what the back end is like.
Looking at the daily Sun timelapse, it looks like the Sun's rotation is more than 1/365th of the Sun's diameter. What am I missing?
Good eye! That's the Sun's own rotation: ~27 days (the Carrington rotation period) at the equator; it's plasma, so it rotates more slowly at the poles. 24 hrs ≈ 13° of longitude ≈ ~7% of the visible disk. 1/365th per day would be Earth's orbit, which is a different motion :)
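The arithmetic above checks out in a few lines. A minimal sketch, using the standard ~27.28-day synodic Carrington period and the same rough approximation as the comment (the visible hemisphere spans 180° of longitude):

```python
CARRINGTON_DAYS = 27.2753                 # synodic Carrington rotation period, days
deg_per_day = 360.0 / CARRINGTON_DAYS     # degrees of longitude per day, ~13.2
disk_fraction = deg_per_day / 180.0       # visible hemisphere spans 180°, so ~7%
earth_orbit_deg = 360.0 / 365.25          # Earth's orbital motion, ~1°/day

print(f"{deg_per_day:.1f} deg/day ≈ {disk_fraction:.1%} of the disk")
print(f"Earth's orbit: {earth_orbit_deg:.2f} deg/day")
```

So a sunspot really does march ~13x faster across the disk than the 1/365th-per-day you'd expect from Earth's orbit alone.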
The Appstore button redirects to https://beeswaxpat.github.io/lumara-legal/
Thank you! It is live on Android and in review on the App Store, hopefully live shortly. I'll remove that hyperlink from the App Store image until it's live.
Looks refreshing. Titles can't capture visual projects like these
Thank you so much. This is one of my favorite projects: few bugs, straightforward. I find it refreshing too, to sometimes take a step back and observe the Sun and space.
It's on the Google Play Store for Android phones under Lumara, and hopefully on the App Store within a day or so too! I find the desktop experience the best, though, since it includes the ISS live cam feed of the Earth.
"Live" from the sun, minus the ~500 lightseconds it takes to get here :)
Also, the videos are made with frames taken every 12 seconds or so over 24 hours, so I am definitely using "live" very liberally :D
I can see Claude
That's raw NASA SDO satellite footage. Claude (Opus 4.7) was used almost exclusively for building the site. It's a static site on Render (no hosting fees), pushed from GitHub, using NASA APIs (free): a very cost-friendly project on the ole wallet!
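For anyone curious what pulling this imagery looks like: a minimal sketch of building URLs for SDO's public browse images. The URL pattern below matches the "latest" assets on sdo.gsfc.nasa.gov, but it is an assumption here, not something taken from the Lumara source:

```python
# Public SDO browse imagery; the path pattern is an assumption based on the
# site's "latest" assets (sizes like 512/1024/2048, AIA channels like 0171/0193/0304).
SDO_BASE = "https://sdo.gsfc.nasa.gov/assets/img/latest"

def latest_image_url(wavelength: str = "0193", size: int = 1024) -> str:
    """Build the URL for the most recent browse image of one AIA channel."""
    return f"{SDO_BASE}/latest_{size}_{wavelength}.jpg"

print(latest_image_url())              # 1024px view of the 193 Å channel
print(latest_image_url("0304", 512))   # smaller 304 Å view
```

Since the images are static files behind plain URLs, a static site can hotlink or periodically mirror them with no server-side code at all, which is what keeps hosting effectively free.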
I'll add that "raw" is after a bit of postprocessing to make it pretty.
When the SDO webserver went down a few months ago, I rebuilt the L1 data processing pipeline from JSOC so we could still do outreach, and there's a surprising amount of opinion that goes into the mapping of data to visualization for each wavelength. My composite movies came out looking more like an acid trip than solar data.
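The "opinion" in that mapping is easy to illustrate. A toy sketch of per-channel display scaling, where the exponents and clip ceilings below are hypothetical placeholder values, not the tuned figures real pipelines (e.g. SunPy/aiapy) ship per AIA band:

```python
# Hypothetical per-channel choices: a power-law stretch exponent and a clip
# ceiling in detector counts. Every one of these numbers is a judgment call,
# which is exactly why two pipelines render the same wavelength differently.
CHANNEL_STRETCH = {
    "171": (0.5, 2000.0),   # sqrt-like stretch for a bright channel
    "304": (0.35, 1500.0),
    "94":  (0.25, 300.0),   # faint channel: aggressive stretch, low ceiling
}

def scale_pixel(dn: float, channel: str) -> float:
    """Map one detector value to [0, 1] for display."""
    gamma, ceiling = CHANNEL_STRETCH[channel]
    clipped = min(max(dn, 0.0), ceiling)
    return (clipped / ceiling) ** gamma

print(scale_pixel(500.0, "171"))   # mid-brightness pixel in the 171 Å channel
```

Change any exponent or ceiling and the same data reads completely differently on screen, hence the acid-trip failure mode when the choices are off.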
Touché — when the person who rebuilt the pipeline says it's not raw, it's not raw :)
Is optical-flow interpolation a step too far for outreach, or fair game? I'm tempted to motion-interpolate (ffmpeg's minterpolate) the daily MP4s up to 60 fps for Lumara: it looks gorgeous, but the in-between frames are synthesized rather than captured. You're totally right about "raw"; I suppose I meant more "straight from NASA APIs".
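For reference, the minterpolate pass being considered looks roughly like this. A sketch that builds the ffmpeg command from Python (the filenames are hypothetical; `mi_mode=mci` selects motion-compensated interpolation, i.e. the optical-flow-style synthesized in-between frames):

```python
import shlex

def minterpolate_cmd(src: str, dst: str, fps: int = 60) -> list[str]:
    """Build an ffmpeg command that motion-interpolates src up to `fps`.

    mi_mode=mci synthesizes in-between frames from estimated motion vectors,
    rather than duplicating (dup) or cross-fading (blend) existing frames.
    """
    vf = f"minterpolate=fps={fps}:mi_mode=mci:mc_mode=aobmc:me_mode=bidir"
    return ["ffmpeg", "-i", src, "-vf", vf,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", dst]

cmd = minterpolate_cmd("sun_daily.mp4", "sun_daily_60fps.mp4")
print(shlex.join(cmd))
```

One honest middle ground for outreach is to keep both renditions and label the smooth one as interpolated, since the motion-compensated frames are estimates, not observations.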
Awesome! Now I wish screensavers were a thing again.
Me too! I kind of forgot about them for a minute. You see more screensavers on TV now than on the computer!