Community tools and Hobby projects

Salvatore Iovene

This thread is for individual members who would like to share small, non-commercial astronomy or astrophotography tools they’ve built as hobby projects.

To keep the forum organized and focused, standalone promotional topics for external apps, platforms, or services are not allowed. Please use this thread instead.

What qualifies

  • An individual hobby project.

  • Astronomy or astrophotography related.

  • Free to use at the time of posting.

  • Not operated by a company, startup, or commercial entity.

  • Not selling products, subscriptions, access, or services.

  • Not used to promote, funnel users to, or build awareness for another commercial product, service, or brand.

  • Not collecting users for a launch campaign, waitlist, or growth marketing effort.

Rules

  • One post per project (responding to specific questions is allowed).

  • Keep the description concise.

  • No affiliate or referral links.

  • No repeated bumps or promotional updates.

  • If the project later becomes commercial, do not create new promotional posts.

Moderators may remove posts that do not align with the spirit of this thread, and posts that don’t meet these criteria may be removed without prior notice.

If your project has already been announced on AstroBin, please do not repost it here. We keep a single announcement per project to avoid duplication.

Discussion and constructive feedback are welcome.

Łukasz Pożarlik

Hi everyone 👋

I’d like to share a hobby project I’ve been building: stargazer.earth.

It actually started as a simple tool to notify me about good observing weather - because, as many of you probably know, clear nights are rare… especially when you’re a busy dad trying to plan ahead 😉

Over time, it evolved into a more complete platform for planning and tracking astronomical observations. It now has 120+ users across all continents, which is both exciting and slightly overwhelming. I'm doing my best to make sure it's truly useful in practice.

What it currently offers:

  • 🔭 Smart weather notifications (fully customizable - cloud cover, precipitation, visibility, even azimuth limits if you observe from a balcony)

  • 🗂 Equipment management (scopes, eyepieces, filters, cameras)

  • 📊 Observation session tracking with duration logs & statistics

  • 🌌 Built-in Messier, NGC & Solar System catalogs

  • 🤖 AI assistant (Nova) for planning help and potential session summaries
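For the curious, a customizable weather notification like this boils down to comparing each forecast value against the user's limits. A toy sketch of that check (all field names and thresholds here are hypothetical, not stargazer.earth's actual schema):

```python
# Hypothetical threshold check for a "good observing night" alert.
# Field names and limits are illustrative, not stargazer.earth's schema.

def is_good_night(forecast, limits):
    """Return True when every forecast value satisfies the user's limits."""
    return (
        forecast["cloud_cover"] <= limits["max_cloud_cover"]
        and forecast["precipitation"] <= limits["max_precipitation"]
        and forecast["visibility_km"] >= limits["min_visibility_km"]
    )

limits = {"max_cloud_cover": 20, "max_precipitation": 0, "min_visibility_km": 10}
tonight = {"cloud_cover": 15, "precipitation": 0, "visibility_km": 25}
print(is_good_night(tonight, limits))  # True
```

In practice one would re-run this per forecast hour and only notify when enough consecutive hours pass the check.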

The app is completely free and will remain so as long as I can cover the hosting costs. I may consider optional donations or small support tiers in the future, but for now it’s just a side project built by one astronomy enthusiast for fellow enthusiasts.

I’d love to hear your thoughts - especially from astrophotographers here on AstroBin.

Clear skies to everyone (and greetings to all busy dads out there!)


Łukasz

Stellar Nomads

Great idea, Salvatore.

Well, I made a small project to make it easier for astronomers to visualize their field of view. I added the most common equipment, and I can keep adding more if there are any requests.

Enjoy!

it is here: https://www.stellarnomads.com/fov/


Eduardo Piñas

Hi everyone!

A few months ago I launched CelestView, an Android app that lets you open and view FITS, XISF, and TIFF files right on your phone or tablet.

Sometimes someone shares a FITS file with you and you can’t check it until you get back home to your PC. That used to happen to me all the time, and whenever I searched Google Play I couldn’t find an app that did what I needed—so I decided to build one.

Key features

  • Astronomical image viewer: Open and inspect your captures with high fidelity directly on your mobile device

  • FITS support: Easily open FITS (Flexible Image Transport System), the standard format in astrophotography

  • TIFF support: Open stretched or unstretched TIFF files

  • New: Open PixInsight XISF files (including files with multiple attachments)

  • Quick export: Convert to common formats like PNG or JPG without using desktop software—great for sharing results instantly
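As background on what "FITS support" involves: per the FITS standard, a header is a run of 80-character ASCII cards padded into 2880-byte blocks, ending with an END card. A minimal sketch of parsing one such block (illustrative only, not CelestView's code):

```python
# Minimal illustration of the FITS header layout a viewer has to parse:
# 80-character ASCII "cards" padded into 2880-byte blocks, per the FITS
# standard. Sketch only — not CelestView's actual implementation.

def parse_fits_cards(block: bytes) -> dict:
    """Extract simple KEYWORD = value cards from one 2880-byte header block."""
    cards = [block[i:i + 80].decode("ascii") for i in range(0, len(block), 80)]
    header = {}
    for card in cards:
        key = card[:8].strip()          # keyword occupies columns 1-8
        if key == "END":                # END card terminates the header
            break
        if card[8:10] == "= ":          # value indicator in columns 9-10
            header[key] = card[10:].split("/")[0].strip()  # drop comment
    return header

# Build a tiny fake header block to demonstrate:
raw = "".join(c.ljust(80) for c in ["SIMPLE  =                    T",
                                    "BITPIX  =                   16",
                                    "END"]).ljust(2880).encode("ascii")
print(parse_fits_cards(raw))  # {'SIMPLE': 'T', 'BITPIX': '16'}
```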

Upcoming updates

I’m committed to continuously improving CelestView. Future updates (besides quality and performance improvements) will include:

  • Support for viewing DSLR RAW images

If you’d like to try it out, I’d love to hear your feedback (features you miss, devices it works/doesn’t work on, etc.).

Download link

chvvkumar

I made a Tampermonkey script to open targets from AstroBin in Stellarium and NINA.

It adds buttons to the identified-objects list at the top of an AstroBin image page; clicking one takes you to the target in Stellarium, zooms in, and locks onto it in the view (with or without Oculars enabled).

To work, it needs Tampermonkey and Stellarium's Remote Control plugin enabled.

https://github.com/chvvkumar/cloudynights_customization/blob/main/astrobin-stellarium-nina.js


Robert Žibreg

Hi all. I’d like to share ASCOM Alpaca ESP32 Dome (Roll-off roof) & Safety Controller for Remote Observatory

This project upgrades the classic Rolling Roof Computer Interface (RRCI) to the modern ASCOM Alpaca standard using a single ESP32 dev board. It replaces the old USB-tethered Arduino solution with a completely wireless, OS-independent system for roof control. A relay controls the roof motor, and there are open, closed, and safety sensors. The safety sensor is attached to the mount in the park position, so there is a physical sensor confirming park.

📷 esp32_setup.jpg

The controller presents itself to your astronomy software (NINA, Voyager, SGP) as two distinct Alpaca devices:

  • Dome Device: Handles roof movement (Open/Close).

  • Safety Monitor: Prevents the roof from moving unless the telescope is safely parked.

Note: the Dome Alpaca endpoints (Open/Close) do not enforce the park sensor and behave like a classic RRCI controller.

  1. The Safety Alpaca device exposes the park-sensor state; users who want interlocks should configure their automation (e.g. NINA's "Safety Monitor") to require this device to be safe before issuing dome commands.

  2. If you want the park sensor enforced, use the Safety Monitor driver in your astronomy software (such as NINA); roof movement is then blocked in the unsafe condition (mount not parked).

It eliminates the need for ASCOM Platform drivers on the client side—just connect over IP!

Features

  1. Alpaca Native: No Windows drivers required. Works on Linux (INDI), macOS, and Windows purely over WiFi.

  2. Integrated Safety: Physical "Mount Parked" sensor input acts as a hardware interlock. The driver rejects commands if the mount is not parked.

Dual Interface

  1. ASCOM/Alpaca API: Automated control for imaging sessions.

  2. Web Dashboard: A mobile-friendly manual control panel hosted on the device (Port 11111).

  3. Manual Override: The Web UI includes "Override" buttons to force movement during testing or emergencies, bypassing safety sensors.

  4. Low Cost: Built for ~20€ using standard components.

Hardware Required (~20€ on Aliexpress)

  • ESP32 Development Board: (Recommended: ESP32-WROOM-32U with external antenna connector for better range inside metal domes).

  • ESP32 Expansion Board (38-Pin): For easy screw-terminal connections.

  • 5V Relay Module (1-Channel): To trigger your existing roof motor controller.

  • 3x Magnetic Reed Switches (Normally Open): For detecting Roof Open, Roof Closed, and Mount Parked status.

Pinout & Wiring

Refer to User Settings in the attached sketch.

  1. Relay Trigger GPIO 26 Connect to Relay Input (IN). Triggers the roof motor.

  2. Roof Closed Sensor GPIO 32 Connect sensor between Pin 32 and GND.

  3. Roof Open Sensor GPIO 33 Connect sensor between Pin 33 and GND.

  4. Mount Safe Sensor GPIO 27 Connect sensor between Pin 27 and GND.

Note: The sensors use internal INPUT_PULLUP. Wire them to switch to Ground when active.

Libraries Needed (ArduinoIDE):

  1. ESPAsyncWebServer

  2. AsyncTCP

  3. (Standard ESP32 WiFi libraries)

Configuration:

  1. Open the sketch and locate the // ========= USER SETTINGS ========= section.

  2. CRITICAL: Change ssid and password to match your observatory's WiFi.

  3. The default port is set to 11111.

How to Connect (NINA / ASCOM)

Because "Alpaca Discovery" (UDP) is disabled to save resources for the web server, you must add the device manually.

  1. Open ASCOM Diagnostics.

  2. Choose Device > Choose and Connect to Device

  3. Select Dome as Device Type > Choose > Alpaca > Create Alpaca Driver

  4. Add a Dome name, e.g. "ESP32 Dome", and click OK

  5. Enter the IP address of your ESP32 on your WiFi network

  6. Enter Port: 11111.

  7. Click OK.

  8. In the main Device Connection Tester window (from step 2), select Choose, pick your newly created Dome (ESP32 Dome), click OK, and then Connect to test.

  9. Do the same for the Safety Monitor: repeat the steps above, but pick Safety in the "Select Device Type" dropdown.
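Once the controller is reachable, you can also sanity-check it over its Alpaca REST API before involving NINA. The URL layout below follows the ASCOM Alpaca convention (`/api/v1/{device}/{number}/{action}`); the IP address is a placeholder for your ESP32:

```python
# Query the controller's Alpaca REST endpoints directly. The URL shape
# follows the ASCOM Alpaca spec; 192.168.1.50 is a placeholder for the
# ESP32's address on your network.
import json
import urllib.request

def alpaca_url(host, device_type, device_number, action, port=11111):
    return f"http://{host}:{port}/api/v1/{device_type}/{device_number}/{action}"

def alpaca_get(url):
    """GET an Alpaca endpoint; the reading is in the JSON 'Value' field."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)["Value"]

url = alpaca_url("192.168.1.50", "safetymonitor", 0, "issafe")
print(url)  # http://192.168.1.50:11111/api/v1/safetymonitor/0/issafe
# alpaca_get(url)  -> True only when the mount-parked sensor reads safe
# alpaca_get(alpaca_url("192.168.1.50", "dome", 0, "shutterstatus"))
```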

Web Dashboard

  1. Access http://[YOUR_ESP32_IP]:11111 in any browser.

  2. Standard Web Controls: Respect the "Mount Safe" sensor.

  3. Override Controls: Ignore sensors (Use with caution!).

  4. Status: Shows real-time feedback for "Roof Open", "Roof Closed", and "Park Sensor".

📷 esp32_UI.jpg 📷 esp32_alpaca.jpg 📷 esp32_alpaca_safety.jpg

Safety Warning

!!! USE AT YOUR OWN RISK !!!

Remote observatory control carries risks of equipment collision.

The "Mount Safe" sensor implementation relies on your physical sensor placement.

Always have a backup method to cut power to the roof motor in case of software failure.

TODO: My router lets me assign a static IP based on the ESP32's MAC address (which is what I did), but some people would benefit from a static-IP setting in the sketch file, so it's on my to-do list.

Link to the ESP32 .ino file: https://pastebin.com/QkQr1W9e

Kevin Guion

Hi everyone,

I'd like to share two free, open-source tools I've built for astrophotography:

AstroManager — File Management Suite

- File analysis — recursive scan of FITS/XISF/FZ files, automatic target detection, filter ID (LRGB, narrowband), telescope/camera recognition from 40,000+ equipment entries

- Compression — 9 profiles (zlib, zstd, lz4), bidirectional FITS/XISF/FZ conversion, SHA-256 verification, parallel processing

- Mass header editing — bulk edit FITS headers, NINA-compatible fields, filename pattern builder with undo

- Flat frame management — auto-grouping by date/setup/filter/binning, coverage reports, master flat tracking

- Target tracking & statistics — integration time per target/filter, best sessions, weather integration (Open-Meteo), visibility/altitude charts, SIMBAD lookup, dashboards, full JSON/CSV export

- Storage optimization — duplicate detection (name + content + compressed pairs), file organization presets

- Reports — LaTeX/PDF, HTML, CSV, AstroBin CSV with automatic filter ID mapping

- Plate solving — ASTAP and Astrometry.net, auto focal reducer detection

Handles 200,000+ files. https://github.com/ARP273-ROSE/astromanager
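For anyone curious how content-based duplicate detection can work, here is a minimal sketch: hash each file with SHA-256 and group paths that share a digest (illustrative only, not AstroManager's actual implementation):

```python
# Minimal sketch of content-based duplicate detection: hash each file
# with SHA-256, then group paths sharing a digest. Illustrative only —
# not AstroManager's actual code.
import hashlib
from collections import defaultdict

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def group_by_digest(pairs):
    """pairs: iterable of (path, digest) -> lists of paths sharing a digest."""
    groups = defaultdict(list)
    for path, digest in pairs:
        groups[digest].append(path)
    return [g for g in groups.values() if len(g) > 1]

print(group_by_digest([("a.fits", "x"), ("b.fits", "x"), ("c.fits", "y")]))
# [['a.fits', 'b.fits']]
```

Streaming in chunks keeps memory flat even on multi-gigabyte stacks, which matters at the 200,000-file scale mentioned above.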

Backfocus Calculator — Visual Optical Train Builder

- Visual builder — telescope on the left, camera on the right, drag parts in order, real-time backfocus calculation with color-coded gap

- 12,100+ product database — 125 brands, 22 part types, 29 connection types (M42, M48, M54, EOS, Nikon Z, Sony E, ZWO bolt mounts…)

- Connection check — verifies thread diameter and male/female pairing, warns on mismatches

- Auto-complete — finds 1–3 part combinations from your inventory to fill the remaining gap

- FITS/XISF Backfocus Analyzer — star detection, PSF fitting, radial/tangential elongation diagnosis, 3x3 mosaic view inspired by Ekos Aberration Inspector

- Save/export/import — individual configs or full data as JSON

No external dependencies for the core app. https://github.com/ARP273-ROSE/backfocus

---

Both run on Windows/Linux/macOS, bilingual EN/FR, dark theme.

Feedback and suggestions welcome!

bobtobb

Hi! I made something I think is awesome :)

I have a 360 camera and wanted to get PERFECT horizon files into NINA and Stellarium. So I made a simple web-based tool (it makes ZERO server calls and does ZERO logging - save the page to your disk for offline use when travelling to star parties!)

https://neuronburner.com/hrz-creator/

There’s a Load Example button that uses an image as an example. It displays your 360 image in two projections at the same time. Have fun!

And then it exports .hrz files etc :)
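For anyone curious what such horizon files contain: in the common case they are a plain-text list of azimuth/altitude pairs in degrees, one sample per line. A sketch under that assumption (check NINA's documentation for the exact format it expects):

```python
# Sketch of emitting a horizon file as azimuth/altitude pairs in degrees,
# one sample per line. This layout is an assumption about the common
# format — verify against NINA's docs before relying on it.

def write_hrz(samples, step=10):
    """samples: dict of az_degrees -> horizon altitude in degrees."""
    lines = [f"{az} {samples.get(az, 0)}" for az in range(0, 360, step)]
    return "\n".join(lines) + "\n"

horizon = {0: 12, 90: 5, 180: 25, 270: 8}   # e.g. trees to the south
print(write_hrz(horizon)[:20])
```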


chvvkumar

I also made a NINA dashboard for my desk that works with up to three rigs running NINA. It uses the NINA Advanced API plugin to provide the required data.

https://github.com/chvvkumar/ESP32-P4-NINA-Display

📷 Devicesummary.jpg 📷 Oiii.jpg 📷 RA_DEC.jpg 📷 autofocus.jpg

Alain ESCAFFRE

Hi!

I am pleased to share a fabulous experiment: I have implemented and exposed an MCP server for PixInsight, along with a set of skills/instructions. As a result, I work out my processing flow in Claude (ChatGPT-style), and Claude drives PixInsight to run everything. Processing an image, starting from the raw integrated images, takes just 10-15 minutes of processing time even with more than 20 processing steps. It is really great and revolutionary. I posted this a few days ago, but I must have made a mistake, as I can't find my post anymore; I hope it wasn't flagged as spam, because it is a real project, so I'm starting again.

You can find the sources of my project on GitHub: https://github.com/aescaffre/pixinsight-mcp

I posted videos in a comment down the thread.

Happy to discuss more if it raises interest. Here is an NGC 891 processed with it (it is a crop; C9.25 with a 0.63x reducer, and the full frame is much larger).


📷 Capture d'écran 2026-02-20 à 00.08.28.png

lunohodov

Alain,

This is a really interesting idea!

Although I find processing to be part of the thrill, lots of people find it tiresome. This tool could help them get around the “boring” parts. I’m certain this approach has potential in other scenarios too.

Congratulations for thinking out of the (script) box and a job well done!

Alain ESCAFFRE

Thank you for your nice comment. Yes, it is not uncommon to see thousands of frames accumulate for lack of time; that's definitely one use case.

But the other is that, if you are picky and looking for perfection, you can run some 20 different tests, even asking Claude to try and learn, and it works in no time. This is still kind of the genuine way; we can build a lot on this, and I think it has the potential to raise the level of the end results. I'll share a video by the end of the weekend.

Spacetime Pictures

Hi Alain,

Very interesting approach to conventional processing, especially for iterative purposes as you said. Thanks for sharing! Looking forward to your video :)

Cheers,

Laurent

Jos Stassen

Hi Alain, what a great idea! I definitely want to try this myself. I’m really looking forward to the video, as I’ll need some guidance to set it up on my own. Thanks for recording it!

Tony Gondola

Interesting and I’m sure it’s the future. Really it comes down to what parts of the hobby you enjoy. If it’s all about production and you dislike the processing end, this will be great. If you enjoy the processing end, it’s boring. Personally, I enjoy the creativity and experience of doing it myself.

It’s very much the same as the difference between actually playing music verses having Suno AI create it for you. Kudos on the work but it’s just not for me.

Michael

Hello Alain
… and wowww, great.

As I am using Google Gemini, I instantly started a thread with Gemini asking whether this could also work with Gemini.
The answer was yes, plus a step-by-step installation guide for my MacBook.

Is it okay with you if I download your GitHub code as a zip, and what are my obligations when using it?

Best regards.

Michael

Alain ESCAFFRE

Here are the videos,

Disclaimer: I am not a YouTuber. The videos are very raw and too long, and I have no software to cut the waiting time; it is just a straight recording with QuickTime, so feel free to fast-forward. Also, yes, the results are not convincing, but I am not here for marketing; I am here to share the concept and hopefully also get PixInsight to move on this topic too ;)

Maybe I'll do something more polished later, when I have more experience with my system.


- A first video with a few slides to explain the concepts, and an attempt at processing only an L frame. Since it was the first time I was doing this, there is a part where Claude evolves the script that is a bit long; you can skip over that part.
https://youtu.be/XLXS9vysVEs (part 1)
https://youtu.be/c-kPSQEipLg (part 2, because I stopped too quickly)

- A second video that is more straightforward: it starts immediately with processing some R, V, B, L, and Ha frames resulting from WBPP. It doesn't go all the way to a nice result, though; that would have been too long. But you can see the principle of iterations well, and near the end I also show another part: a web application that displays the images from all the steps whenever you want to check. https://youtu.be/VT4oCp8fUz0

Alain ESCAFFRE

Michael · Feb 22, 2026 at 05:58 PM

Hello Alain
… and wowww, great.

As I am using Google Gemini, I instantly started a thread with Gemini asking whether this could also work with Gemini.
The answer was yes, plus a step-by-step installation guide for my MacBook.

Is it okay with you if I download your GitHub code as a zip, and what are my obligations when using it?

Best regards.

Michael

No problem with using it; it is open source under the MIT License. It works with any agentic framework in principle, but it needs to run on a desktop, not the ChatGPT or Gemini online services, and obviously not from your phone.
Also, I really think Claude is the best (Claude Code or Claude Desktop).

Vilen Sharifov

Hey everyone! I'd like to share a project I've been working on — Athenaeum, a free desktop app for managing astrophotography FITS and XISF files.

I built this primarily for myself because I got tired of manually sorting through thousands of frames from multi-night, multi-camera setups. It turned out useful enough that I decided to release it for everyone.

What it does:

File Manager — Monitor directories and instantly browse metadata from your FITS/XISF files

Objects Library — Automatically groups your light frames by sky coordinates, no manual tagging needed if your metadata is valid

Shoot Calendar — See what you shot on any given night at a glance

Calibration Matching — Configurable matching of darks, flats, bias, and dark flats by camera, gain, temperature, binning, date, etc.

WBPP Export — Export a folder structure for multi-camera, multi-night data that is PixInsight WBPP-ready; apply the keywords the app provides and you get full calibration chains

Sky Chart — Interactive map of all your targets across the sky

Blinker — Fast frame blinking for spotting artifacts and tracking issues; at its core is my library for fast FITS/XISF conversion: https://github.com/eg013ra1n/rustafits

Duplicate Detection — Find and clean up duplicate files using xxHash checksums

Available for Windows, macOS, and Linux: https://artfrom.space
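As an aside, the calibration matching described above can be pictured as filtering darks on the light frame's capture settings. A toy sketch with illustrative field names (not Athenaeum's actual schema):

```python
# Toy version of calibration matching: pair a light frame with darks
# that share camera, gain, and binning, and sit within a temperature
# tolerance. Field names are illustrative, not Athenaeum's schema.

def match_darks(light, darks, temp_tolerance=1.0):
    """Return darks compatible with a light frame's capture settings."""
    return [
        d for d in darks
        if d["camera"] == light["camera"]
        and d["gain"] == light["gain"]
        and d["binning"] == light["binning"]
        and abs(d["temp_c"] - light["temp_c"]) <= temp_tolerance
    ]

light = {"camera": "ASI2600MM", "gain": 100, "binning": "1x1", "temp_c": -10.0}
darks = [
    {"camera": "ASI2600MM", "gain": 100, "binning": "1x1", "temp_c": -10.2},
    {"camera": "ASI2600MM", "gain": 0,   "binning": "1x1", "temp_c": -10.0},
]
print(len(match_darks(light, darks)))  # 1 — only the gain-100 dark matches
```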

The app is free and open-source. Feature guides and manuals are on the website if anything is unclear. It's still early days, so if you run into any bugs or have suggestions, please reach out via DM or join the Telegram group: https://t.me/athenaeum_astro

Peter Quadarella

Interesting. I was thinking about attempting this with an agent through OpenClaw. I already have a dedicated and isolated OpenClaw box running where I can just drop files for a (multi-session) target, and it manages my library and automatically stacks everything using Siril. The obvious next step is processing, but that is a bigger deal. Good job!

Brett Johnson

I'm very interested in this… please let me know what I can help test and how. I have VS Code set up with Claude, so I can look at making this happen!

Bob ten Berge

Kevin Guion · Feb 21, 2026, 09:47 AM

Backfocus Calculator — Visual Optical Train Builder

- Visual builder — telescope on the left, camera on the right, drag parts in order, real-time backfocus calculation with color-coded gap

- 12,100+ product database — 125 brands, 22 part types, 29 connection types (M42, M48, M54, EOS, Nikon Z, Sony E, ZWO bolt mounts…)

- Connection check — verifies thread diameter and male/female pairing, warns on mismatches

- Auto-complete — finds 1–3 part combinations from your inventory to fill the remaining gap

- FITS/XISF Backfocus Analyzer — star detection, PSF fitting, radial/tangential elongation diagnosis, 3x3 mosaic view inspired by Ekos Aberration Inspector

- Save/export/import — individual configs or full data as JSON

No external dependencies for the core app. https://github.com/ARP273-ROSE/backfocus

---

Both run on Windows/Linux/macOS, bilingual EN/FR, dark theme.

Feedback and suggestions welcome!

Hi Kevin!

Very interesting and helpful tool to visualize and help find the right back focus for those that are having trouble with this subject.

One question though, how would you take filter thickness into account, when applicable?

One typically needs to add 1/3rd of a filter’s thickness to the total back focus (so 55mm back focus becomes 56mm when using 3mm thick filters).

Filters aren’t part of the catalogue, but perhaps it would be something to point out when selecting a filter wheel or drawer for the optical train?
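For reference, the rule of thumb above comes from the fact that a plane-parallel filter of thickness t and refractive index n shifts the focal plane back by t·(n − 1)/n, which is about t/3 for typical glass (n ≈ 1.5):

```python
# Bob's rule of thumb in code: a plane-parallel filter of thickness t
# and refractive index n pushes the focal plane back by t * (n - 1) / n,
# ~t/3 for typical glass (n ≈ 1.5).

def effective_backfocus(nominal_mm, filter_mm, n=1.5):
    """Required backfocus after adding a filter of the given thickness."""
    return nominal_mm + filter_mm * (n - 1) / n

print(effective_backfocus(55, 3))  # 56.0 — the 55 mm -> 56 mm example above
```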

Alain ESCAFFRE

Brett Johnson · Feb 23, 2026 at 09:51 AM

I’m very interested in this… please let me know what and how I can help test. I have VS Code setup with Claude - so can look at making this happen…!

I will commit regular improvements as I use it. For now it is still for people who like to explore new territories; it is not greatly productized, and it is sometimes wow, sometimes frustrating. But if you want to try it, I'll be happy to help, exchange, and collaborate. I don't think you need VS Code; Claude Code from the terminal (or the desktop app) is fine. I'm not sure how to exchange emails or whether there is a DM feature here.

akshay87kumar

I’ve been incorporating AI tools more deeply into my professional work over the past few years - using them to structure thinking and build small utilities around real problems. Recently, I started extending that exploration into astrophotography as well.

The current project I’m working on is a simple deep-sky target planning tool. The goal is straightforward: help answer the question, what should I image tonight based on the equipment I have and will it actually frame well with my setup? The tool looks at what’s in the sky for a given night and location, and then evaluates suitability based on practical constraints like focal length, camera sensor size, and general visibility window.

Instead of just browsing very long object lists, the idea is to see targets that suit your equipment. Will this nebula meaningfully fill the frame? Is this galaxy going to be too small at my focal length? Does the target stay high enough in the sky for a reasonable imaging session? It’s less about complex modeling and more about bringing these considerations together into one planning view.
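The framing questions above reduce to simple small-angle optics: field of view ≈ sensor size / focal length, converted from radians. A sketch (the example numbers are illustrative):

```python
# Small-angle framing check: FOV ≈ sensor size / focal length (radians),
# converted to arcminutes. Example figures are illustrative.
import math

ARCMIN_PER_RAD = 60 * 180 / math.pi   # ≈ 3437.75 arcminutes per radian

def fov_arcmin(sensor_mm, focal_mm):
    """Field of view along one sensor axis, in arcminutes."""
    return sensor_mm / focal_mm * ARCMIN_PER_RAD

def frame_fill(target_arcmin, sensor_mm, focal_mm):
    """Fraction of the frame the target spans along that axis."""
    return target_arcmin / fov_arcmin(sensor_mm, focal_mm)

# APS-C long side (23.5 mm) at 530 mm focal length:
print(round(fov_arcmin(23.5, 530)))          # ~152 arcmin of sky
print(round(frame_fill(60, 23.5, 530), 2))   # a 60' target spans ~0.39 of the width
```

The same two functions answer both questions in the post: a small galaxy gives a tiny fill fraction at short focal lengths, while a large nebula overflows the frame when the fraction exceeds 1.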

This isn't a commercial effort - it's primarily a learning and experimentation space where I'm exploring how AI can support structured planning in our hobby. That said, I'm very open to thoughtful feedback or serious discussions around workflow, framing logic, or how others approach target selection. Explore it here:

Link: AI Playground - Dr Gopal Krishna Kunwar Observatory

📷 Target Selector-Banner.jpg

Interactive Sky

Hi everyone 👋

I’d like to share a hobby plugin I’ve been developing for N.I.N.A. called Interactive Sky.

It’s a planetarium-style interface integrated directly inside NINA to make framing and camera rotation more visual and interactive.

What it does:

• Live FOV overlay
• Drag + rotate interaction
• Camera rotation detection via plate solving
• Mount position marker
• Sync Scope to FOV (Slew)
• Improved FOV stability

I currently use an Alt-Az mount, so I would especially appreciate feedback from EQ mount users to verify rotation and sync behavior across different setups.

The plugin is completely free and open-source.
Download (GitHub releases):
https://github.com/InteractiveSky/nina-interactive-sky/releases

Constructive feedback is very welcome 🙂

Clear skies! 🌌