Hey everyone!

I’m excited to introduce Reitti, a location tracking and analysis application designed to give you insight into your movement patterns and significant places, all while keeping your data private on your own server.

Core Capabilities:

  • Visit Tracking: Automatically recognizes and categorizes the places where you spend time, using customizable detection algorithms (a generic sketch of the idea is shown after this list)
  • Trip Analysis: Analyzes your movements between locations to understand how you travel, whether by walking, cycling, or driving
  • Interactive Timeline: Visualizes all your past activities on an interactive timeline with map and list views that show visit duration, transport method, and distance traveled
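
To give a feel for what the detection does, here is a generic stay-point detection pass in Java. This is only a sketch of the general technique, not Reitti’s actual implementation, and the distance/duration thresholds are made-up placeholders standing in for Reitti’s configurable parameters:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Illustrative only: a simple stay-point detection pass over a time-ordered track.
// Real implementations also deal with GPS noise, gaps, and merging of nearby visits.
record Point(double lat, double lon, Instant time) {}
record Visit(double lat, double lon, Instant start, Instant end) {}

class StayPointSketch {
    // Placeholder thresholds; Reitti exposes its own configurable parameters.
    static final double MAX_DISTANCE_METERS = 150;
    static final Duration MIN_DURATION = Duration.ofMinutes(10);

    static List<Visit> detect(List<Point> points) {
        List<Visit> visits = new ArrayList<>();
        int i = 0;
        while (i < points.size()) {
            Point anchor = points.get(i);
            int j = i + 1;
            // Grow the window while the track stays close to the anchor point.
            while (j < points.size()
                    && distanceMeters(anchor, points.get(j)) <= MAX_DISTANCE_METERS) {
                j++;
            }
            Instant start = anchor.time();
            Instant end = points.get(j - 1).time();
            if (Duration.between(start, end).compareTo(MIN_DURATION) >= 0) {
                visits.add(new Visit(anchor.lat(), anchor.lon(), start, end));
            }
            i = j; // continue after the window
        }
        return visits;
    }

    // Haversine distance between two coordinates, in meters.
    static double distanceMeters(Point a, Point b) {
        double r = 6_371_000;
        double dLat = Math.toRadians(b.lat() - a.lat());
        double dLon = Math.toRadians(b.lon() - a.lon());
        double h = Math.pow(Math.sin(dLat / 2), 2)
                + Math.cos(Math.toRadians(a.lat())) * Math.cos(Math.toRadians(b.lat()))
                * Math.pow(Math.sin(dLon / 2), 2);
        return 2 * r * Math.asin(Math.sqrt(h));
    }
}
```

In this simplified view, the segments between two consecutive detected visits are what the trip analysis then classifies by transport mode.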

Photo Integration:

  • Connect your self-hosted Immich photo server to seamlessly display photos taken at specific locations right within Reitti’s timeline. The interactive photo viewer lets you browse galleries for each place.

Data Import Options:

  • Multiple Formats Supported: Reitti can import existing location data from GPX, GeoJSON, and Google Takeout (JSON) backups
  • (Near) Real-time Updates: Automatically receive location data from mobile apps like OwnTracks and GPSLogger, or push it yourself through our REST API (see the example below)
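
As a rough illustration of the REST path, this is what pushing a single location fix could look like. The endpoint URL, the token header, and the OwnTracks-style payload fields are assumptions made for this sketch; check the Reitti documentation for the actual API:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch of submitting one location fix over HTTP.
// URL, auth header, and payload shape are hypothetical, modeled on an OwnTracks-style message.
public class LocationPush {
    public static void main(String[] args) throws Exception {
        String json = """
                {"_type":"location","lat":60.1699,"lon":24.9384,"tst":1718000000,"acc":12}""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://reitti.example.org/api/v1/ingest/owntracks")) // hypothetical endpoint
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer <your-api-token>")                    // hypothetical auth scheme
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}
```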

Customization:

  • Multiple Geocoding Services: Configurable providers such as Nominatim convert coordinates into human-readable addresses (see the example after this list)
  • User Profiles: Each user manages their own display name, password, and API tokens
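
For context, this is roughly what a reverse-geocoding lookup against a Nominatim instance looks like. Reitti performs these calls for you; the public endpoint and coordinates below are only example values, and a self-hosted Nominatim would use its own base URL:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative reverse-geocoding request: coordinates in, human-readable address out.
public class ReverseGeocodeExample {
    public static void main(String[] args) throws Exception {
        String url = "https://nominatim.openstreetmap.org/reverse?format=jsonv2"
                + "&lat=60.1699&lon=24.9384";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                // Nominatim's usage policy asks for an identifying User-Agent.
                .header("User-Agent", "reitti-demo/0.1 (example only)")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON response carries the address in its "display_name" field.
        System.out.println(response.body());
    }
}
```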

Self-hosting:

  • Reitti is designed to be deployed on your own infrastructure using Docker containers. We provide configuration templates for the linked services it relies on (PostgreSQL, RabbitMQ, and Redis), so all your location data stays on hardware you control.

Reitti is still early in development but already offers a solid set of features. I’d love to hear your feedback and answer any questions so I can tailor Reitti to the community’s needs.

Hope this sparks some interest!

Daniel

  • danielgraf@discuss.tchncs.deOP · 3 days ago

    Thanks for getting back to me. I can look into it. I don’t think it’s connected, but you never know.

    The data goes the same way: first to RabbitMQ and then into the database. So it shouldn’t matter; it’s just another message, or a bunch of them, in the queue.

    • ada@piefed.blahaj.zone · 3 days ago

      Ok, so it may not be frozen. The numbers in the queue seem to imply it is; however, timelines and places are slowly filling out in my history. A couple of dates I had looked at previously were showing tracklogs for the day but no timeline information, and now they’re showing timelines for the day.

      • danielgraf@discuss.tchncs.deOP · 3 days ago

        That’s good, but I still question why it is so slow. If you keep getting these timeout exceptions, at some point the data will stop being analyzed.

        I just re-tested it with multiple concurrent imports into a clean DB, and the stay-detection-queue completed in 10 minutes, so it’s not normal for it to take that long for you. The component that should take the most time is actually the merge-visit-queue, because it puts a lot of stress on the DB. This test was run on my laptop with an AMD Ryzen™ 7 PRO 8840U and 32 GB of RAM.

        • ada@piefed.blahaj.zone · 3 days ago

          Since I last commented, the queue has jumped from about 9,000 outstanding items to 15,000, and it appears that I have timelines for a large amount of my history now.

          However, the estimated time is still slowly creeping up (though only by a minute or two, despite the 6,000 extra items added to the queue).

          I haven’t uploaded anything manually that might have triggered the change in queue size.

          Are there any external calls made while processing this queue that might be adding latency?

          tl;dr - something is definitely happening

          • danielgraf@discuss.tchncs.deOP · 3 days ago

            This process is not triggered by any external events.

            Every ten minutes, an internal background job activates. Its function is to scan the database for any RawLocationPoints that haven’t been processed yet. These unprocessed points are then batched into groups of 100, and each batch is sent as a message to be consumed by the stay-detection-queue. This process naturally adds to the workload of that queue.
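
            A rough sketch of that job in plain Java (the scheduling, the database query, and the queue publishing are stand-ins for illustration, not Reitti’s actual code):

            ```java
            import java.util.ArrayList;
            import java.util.List;
            import java.util.concurrent.Executors;
            import java.util.concurrent.ScheduledExecutorService;
            import java.util.concurrent.TimeUnit;

            // Illustrative only: every ten minutes, fetch the RawLocationPoints that are not yet
            // processed, split them into batches of 100 and publish each batch to the queue.
            public class StayDetectionDispatcher {

                static final int BATCH_SIZE = 100;

                public static void main(String[] args) {
                    ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
                    scheduler.scheduleAtFixedRate(
                            StayDetectionDispatcher::dispatchUnprocessedPoints, 0, 10, TimeUnit.MINUTES);
                }

                static void dispatchUnprocessedPoints() {
                    List<String> pending = loadUnprocessedPointIds();
                    for (int i = 0; i < pending.size(); i += BATCH_SIZE) {
                        List<String> batch = pending.subList(i, Math.min(i + BATCH_SIZE, pending.size()));
                        publishToQueue("stay-detection-queue", batch);
                    }
                }

                // Stand-in for the database query that selects points whose processed flag is unset.
                static List<String> loadUnprocessedPointIds() {
                    return new ArrayList<>();
                }

                // Stand-in for publishing one batch as a RabbitMQ message.
                static void publishToQueue(String queue, List<String> batch) {
                    System.out.printf("would publish %d points to %s%n", batch.size(), queue);
                }
            }
            ```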

            However, if no new location data is being ingested, then once all RawLocationPoints have been processed and their respective flags set, the stay-detection-queue should eventually clear and the system should return to an idle state. I’m still puzzled as to why this initial queue (stay-detection-queue) is exhibiting such slow performance for you, as it’s typically one of the faster steps.

            • ada@piefed.blahaj.zone · 1 day ago

              Well, the last update seems to have cleared the queue, and all of my history from that 10-year import now shows, with trips and places identified!

              But now it’s having issues importing the new Google-format export. I’ve got a 34 MB file there that goes back to 2017, and it says the data has been imported, but it never appears in my history.

              If it’s relevant, there is overlap in the data: my 10-year Takeout import went up to 2023, and my “new format” import starts in 2017 and goes up to a couple of days ago. I changed my Google account in 2017 but stayed logged in to both on my phone simultaneously, so I was accruing location data on both accounts at the same time for a while before I turned it off on my old account.

              • danielgraf@discuss.tchncs.deOP · 19 hours ago

                I’m glad the import of the first file worked out for you in the end. I’m still puzzled why it took that long.

                For the new format, are you on Android or iOS? With the timeline export from Google Maps on iOS we can’t do anything at the moment, because there is actually no raw data in it, only information like “you stayed at this point during this timeframe” and “you traveled between these points”. It’s actually a little bit funny that it aggregates to the same kind of data Reitti ends up with.

                If you are on Android, it could also be a bug when importing that file; I only had a small export from one of my accounts to test with. If you don’t mind creating a bug report, I will have a look. If you don’t want to attach the export file there, feel free to send it to [email protected] and I will look at it privately. No problem.

                As for the overlap between the exports, it depends. If the points are the same, meaning they have the same timestamp, Reitti will discard them. If not, they will be handled like any other new data and will end up triggering a recalculation of visits and trips around that particular time.
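
                A tiny sketch of that dedup rule, purely for illustration (not Reitti’s actual code):

                ```java
                import java.time.Instant;
                import java.util.HashSet;
                import java.util.List;
                import java.util.Set;

                // Points whose timestamp was already imported are discarded; anything new is kept,
                // and the visits/trips around that time would then be recalculated.
                public class OverlapDedup {
                    public static void main(String[] args) {
                        Set<Instant> alreadyImported = new HashSet<>(List.of(
                                Instant.parse("2020-05-01T10:00:00Z")));

                        List<Instant> incoming = List.of(
                                Instant.parse("2020-05-01T10:00:00Z"),   // duplicate -> discarded
                                Instant.parse("2020-05-01T10:05:00Z"));  // new -> stored and reprocessed

                        for (Instant t : incoming) {
                            if (alreadyImported.contains(t)) {
                                System.out.println(t + " discarded (same timestamp already stored)");
                            } else {
                                alreadyImported.add(t);
                                System.out.println(t + " stored; nearby visits and trips get recalculated");
                            }
                        }
                    }
                }
                ```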