
Exporting Google Photos with Takeout

9 November 2024 at 07:00

Google Photos is an incredibly successful product, with over 100M paying customers storing a lifetime of memories. However, there are many circumstances in which you might want to take your memories out of Google Photos. Unfortunately, the tool that Google offers to do this is neither the best nor the most reliable (we ranted about this some time back), as one can see from all the Reddit posts complaining about Google Takeout.

Users complaining about Google Takeout on the official Google Photos Subreddit

This document, therefore, is intended to be a guidebook to help you export your memories from Google Photos for different use cases, such as:

  • Migrating your memories to Apple iCloud
  • Keeping a backup copy of your memories
  • Exporting all your memories to a different service

Let's get started.

What is Google Takeout?

Google Takeout is Google’s data portability project, which allows users of most of their services (including Google Photos, Chrome, Maps, Drive, etc.) to export their data. We have written about its history here, in case you are interested. There are two ways that Google Takeout works:

  • Download the data as multiple zips of up to 50GB each, and then use it however you want
  • Transfer directly to a selected set of services - Apple iCloud, Flickr, SmugMug, and OneDrive

For most use cases, we will have to go through the zip route.

Migrating from Google Photos to Apple iCloud

This is a fairly easy and straightforward process, where you don’t have to do a lot except set it up and wait for your photos to be transferred to the Apple ecosystem. Let’s go through the exact steps.

First step of transferring from Google Photos to Apple Photos
  • Next, make sure you’re signed into the right Google account that you’re using for Google Photos. You can verify and switch accounts, if needed, by clicking on the top-right icon in the above screen.
  • Before clicking on Continue, also verify the albums you want to move to Apple iCloud. By default, “All photo albums included” is selected. You can click on that and select particular albums as well. Photos that are not in any album on Google Photos are usually grouped into year-wise albums, like “Photos from 2023”, etc.
  • Click on Continue. Before you go on to select which service to transfer your memories to, Google will ask you to verify your account.
Second step of transferring from Google Photos to Apple Photos
  • This step is about setting up the service you want to transfer your memories to. The options available are iCloud Photos, Flickr, OneDrive, and SmugMug. The exact same steps should work for all of these services, but for this article, we will restrict ourselves to Apple iCloud.
  • Select Apple iCloud in the “Move to:” dropdown, and then click on Continue.
  • This will trigger a sign-in flow for Apple iCloud. Complete the sign-in, after which you will be brought back to the Google Takeout page.
Third step of transferring from Google Photos to Apple Photos
  • Before moving forward, make sure you have enough storage capacity on your iCloud account, so that you don’t run into any issues while transferring.
  • You can now click on “Agree and continue”. This will start the export process, and you will receive e-mails from both Google and Apple notifying you of this transfer.
Fourth step of transferring from Google Photos to Apple Photos
  • The export process can take a few hours or even more than a day, depending on the size of your collection. However, you don’t have to do anything else. Both Google and Apple will e-mail you once the process is completed, after which you can see the photos in your Apple iCloud.
  • Note that your memories will continue to exist on Google Photos, and new photos taken on your devices will continue to be uploaded to Google Photos. You will have to actively delete your data and/or stop uploads to Google Photos to change this.
  • Some issues we found while doing this on our personal accounts:
    • For Live Photos, Google only transfers the image file to Apple, and not the video file, making Apple treat it like a regular photo instead of a Live Photo.
    • If some photos are present in both your Apple Photos and Google Photos accounts, the transfer will lead to duplication. You can use “Duplicates” in the Utilities section of the iOS Photos app to detect and merge them. However, it does a less-than-perfect job, so many duplicates will have to be deleted manually.
    • Files that are not supported by iCloud Photos (some RAW images, AVI and MPEG-2 videos, etc.) were moved to iCloud Drive. So your data isn’t lost, but part of your Google Photos library won’t be available in iCloud Photos.

Keeping a backup copy of your memories from Google Photos

There are many ways to do this. The path we are focusing on here is for those who use Google Photos as their primary cloud storage, and want to keep a copy of all their photos somewhere else, like a hard disk or another cloud storage.

First step of exporting Zips from Google Photos via Takeout
  • By default, a large number of services are selected for export. Click on “Deselect all”, then scroll down to Google Photos and click on the selection box next to it. On the top right, it should show “1 of xx selected”.
Second step of exporting Zips from Google Photos via Takeout
  • Review which of your memories are selected for export. For backups, you would want to select all the memories on Google Photos. Click on “Multiple formats” to make sure all formats are selected. Similarly, make sure that “All photo albums included” is selected. You can also choose specific formats or albums to export, in case you want that.
  • Now you can scroll to the bottom of the page and click on “Next step”. This next step is about the various options for your export.
Third step of exporting Zips from Google Photos via Takeout
  • There are three sections here where you have to choose the options that work for you:
    • Destination - Where your exported files are going to go
      • If your backup location is one of OneDrive, Dropbox, or Box, choose that. Make sure you have enough storage space on these services.
      • If you want to back up to a hard disk or another cloud storage provider, you can choose either Google Drive or a download link via email. If you choose Google Drive, make sure you have enough storage space on Drive.
    • Frequency
      • Since this is a backup copy while you continue using Google Photos, you should choose “Export every 2 months for 1 year”. Two important things to note here:
        • Google Takeout doesn’t support incremental backups, so the export after 2 months will again cover your entire library. To save storage space, you will have to delete the old backup wherever you’re storing it, once the new export is available.
        • The export period is only 1 year, so you will have to do this again every year to ensure your backup copy has all the latest memories stored in Google Photos
      • If you have a different use case - like moving to a different service or ecosystem and no longer using Google Photos - you can do an “Export once” backup.
    • File type and size
      • Google gives two options for file types - .zip and .tgz. I personally prefer .zip, as decompression is supported on most devices. However, .tgz is also fine, as long as you know how to decompress it.
      • For size, Google gives options ranging from 2GB to 50GB. Note that if your library size is large, you will get multiple zip files to cover your entire library. For large libraries, we would recommend keeping the size to 50GB. However, if you have a bad network connection, downloading these 50GB files might take multiple attempts.
  • Once you have made your selections, you can click on “Create export”. The page will now show an “Export progress” section.
Fourth step of exporting Zips from Google Photos via Takeout
  • Google Takeout will send you an email once the export is completed. You can download the zips from the link provided in the email (if you had selected “Download link via email” as the destination), or go to the selected destination (Google Drive, OneDrive, etc.) to download the zip files. Note that you only need to download the zips if you want to keep the backup in a different location than the destination you chose.
  • Before downloading these, make sure your device has enough free storage. These zip files contain your entire photo and video library and might therefore require a large amount of device storage. A quick way to check that each downloaded zip is intact is sketched right after this list.
  • Once you have downloaded the zips, you can move/upload them to wherever you want them to live as a backup for Google Photos. This could be a hard disk or another cloud provider.
  • If you had selected “Export every 2 months for 1 year” as the frequency, you will get an email from Google Takeout every 2 months, and you will have to repeat the download and upload process every time. Note that you can delete old zips once the latest backups are available, since every 2 months you will get a full backup and not an incremental one. Otherwise, you will consume a lot more storage space than required.
  • Please note that the above process keeps the zip files as the backup. If you want to unzip the files so that the actual photos are available, see the next section.
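For those comfortable with a little scripting, here is a minimal Python sketch of the integrity check mentioned above. The download folder name is a hypothetical placeholder; the script simply runs the standard library's CRC verification over every downloaded zip, so you know the archives are safe to keep (and that older exports are safe to delete).

```python
import zipfile
from pathlib import Path

# Hypothetical folder where the Takeout zips were downloaded - adjust as needed.
download_dir = Path.home() / "takeout-2024"

for archive in sorted(download_dir.glob("*.zip")):
    try:
        with zipfile.ZipFile(archive) as zf:
            bad_member = zf.testzip()  # reads every member and verifies its CRC
            if bad_member:
                print(f"{archive.name}: corrupted entry {bad_member}, re-download this zip")
            else:
                print(f"{archive.name}: OK, {len(zf.namelist())} files")
    except zipfile.BadZipFile:
        print(f"{archive.name}: not a valid zip, the download was likely interrupted")
```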

Moving all your memories to another service

The last section mostly covered keeping an extra backup while you’re still using Google Photos. That’s why we kept the zip files as backup, as you don’t need to unzip and keep the uncompressed folders for backup purposes.

However, there are definitely use cases where you would want to uncompress the zips. For example, if you want to move your entire library to your hard disk, or to another cloud, you would want to uncompress the zips, ensure your metadata is intact, and then upload it.

So how do we do this?

  • Make sure you have enough storage space (at least 2 times your Google Photos storage) on your device
  • Follow the steps of the previous section to download the zip files from Google Takeout
  • Uncompress the zip files one by one
  • When you open the uncompressed folders, you will notice the following:
    • The base folder name is “Takeout”, and within that there would be another folder called “Google Photos”
    • Inside “Google Photos”, you will have folders corresponding to albums on Google Photos. Photos which are not part of any album are automatically added to year-wise albums like “Photos from 2022”, etc.
    • Inside each album, you will see the files associated with your photos and videos
      • Your photo or video media files. These could be JPEG, HEIC, MOV, MP4, etc.
      • A JSON file with the same name as your photo/video containing all the metadata of your memories like location, camera info, tags, etc.
First step of exporting from Google Photos to another service
  • There are, however, a few issues with the exported files:
    • If your library is distributed across multiple zips, an album can be spread across multiple zips (and uncompressed folders), and needs to be combined back together
    • The media file and the corresponding metadata JSON file can also be in different zips
  • Because of this, importing these uncompressed folders into another service directly, one by one, might lead to loss of the metadata associated with your photos. It might also create an incorrect folder/album structure. (A minimal sketch of how media files and their JSON sidecars pair up is shown right after this list.)
  • Thankfully, there are ways to fix these issues:
    • Metadatafixer
      • All you need to do is add all your zip files to this tool, and it will do its work - combine all the photos/videos and their corresponding metadata so it’s readable by any other service
      • Unfortunately, this is a paid tool that costs $24
    • GooglePhotosTakeoutHelper
      • If you have some technical know-how, or don’t want to pay the $24 above, this is a tool that will come to your rescue
      • It works pretty much the same way as Metadatafixer, except that you need to uncompress the zips and move them to a single folder. You can find all the instructions here
  • Once these fixes are done, you can import the output to any other service or your local storage without losing any metadata. This will also work if you want to move your photos from one Google account to another.
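To make the issue concrete, here is a minimal Python sketch of what these tools essentially do, assuming the Takeout layout described above: media files with “<file name>.json” sidecars carrying a photoTakenTime timestamp (the exact sidecar naming can vary between exports). It merges albums that were split across multiple zips into one folder tree and restores the capture time from a sidecar where one is found next to the media file. This is only an illustration of the idea, not a replacement for Metadatafixer or GooglePhotosTakeoutHelper; the paths are hypothetical placeholders.

```python
import json
import os
import shutil
from pathlib import Path

# Hypothetical paths - adjust to where you uncompressed the zips and where the
# merged library should go.
extracted_root = Path.home() / "takeout-extracted"   # contains Takeout/, Takeout 2/, ...
merged_root = Path.home() / "takeout-merged"

for media in extracted_root.rglob("*"):
    # Skip folders and the JSON sidecars themselves.
    if media.is_dir() or media.suffix.lower() == ".json":
        continue

    album = media.parent.name                      # e.g. "Photos from 2022"
    target_dir = merged_root / album
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / media.name
    if not target.exists():                        # albums split across zips merge here
        shutil.copy2(media, target)

    # Sidecar is assumed to be named "<media file>.json"; it may sit in a
    # different zip, so the timestamp is only applied when the sidecar is found.
    sidecar = media.with_name(media.name + ".json")
    if sidecar.exists():
        meta = json.loads(sidecar.read_text())
        taken = int(meta["photoTakenTime"]["timestamp"])
        os.utime(target, (taken, taken))           # restore the original capture time
```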

Moving your memories from Google Photos to Ente

If you want to move away from Google Photos to another service, Ente Photos is the way to go. It offers the same set of features as Google Photos, in a privacy-friendly product that makes it impossible for anyone else to see or access your photos. Migration to Ente is quite easy as well.

  • Download the zip files from Google Takeout as explained above. Uncompress all the zip files into a single folder
  • Create an account at web.ente.io, and download the desktop app from https://ente.io/download
  • Open the desktop app, and click on Upload on the top right
  • Click on “Google Takeout”, then “Select Folder”, and then select the folder you have just created in the earlier step
First step of exporting from Google Photos to Ente
Second step of exporting from Google Photos to Ente
  • Wait for Ente to upload all the files
  • That’s it

Bonus: Permanently delete memories in Google Photos after exporting

If you’re exporting Google Photos for a backup, and you intend to continue using Google Photos, then you don’t need to delete your memories.

However, there are many use cases in which you should delete your memories from Google Photos after exporting, so that you don’t keep getting charged for them:

  • Moving to a different service like Apple Photos or Ente
  • Using your hard disk for storing memories
  • Partially clearing Google Photos so you can continue using Gmail
  • Deleting a few unimportant memories so you can reduce your cloud costs

Deletion should be fairly straightforward:

  • Open the Google Photos app or open the web app at photos.google.com
  • Select all the photos and albums (which includes all the photos inside) you want to delete
Delete photos from Google Photos after exporting
  • Click on the Delete icon at the bottom
  • This will move your photos to the “Bin”, where they will be held for 60 days before being permanently deleted
  • If you want to permanently delete these memories right away, go to “Collections” tab from the bottom bar
  • You will see “Bin” on the top right. Click on that. Review the memories there and ensure that you want to delete them permanently.
  • Clicking on the ellipsis on the top right would show an option to “Empty Bin”. Clicking on this would delete all your memories from the bin.
  • Please ensure you have checked everything before tapping on “Empty Bin” as this is a permanent operation without any recovery methods.

You can refer to this detailed guide for deletion as well.

Monorepo - Our experience

29 October 2024 at 07:00

Nine months ago, we switched to a monorepo. Here I describe our experience with the switch so far.

This is not meant as a prescriptive recommendation, but rather as an anecdotal exposition, in the hope that it might help other teams make informed decisions.

Unlike most forks in the road, we've travelled both paths. So first I will describe the history that led up to the change, outlining how we've already experienced the alternative non-monorepo setup in a similar context, and thus are now well positioned to compare apples to apples.

Platforms and monorepos

Ente began its life half a decade ago. It was meant as an end-to-end encrypted platform for storing all of Vishnu's personal data, but two things happened: Vishnu realized it was not just him that needed such a thing to exist, and he realized it was going to be a lot of work to build his vision.

So he became a we, and instead of tackling all personal data, the focus was shifted to a singular aspect of it, Ente Photos, to get the spaceship off the ground. To an external observer what looks like a photos app (and that indeed is our concrete current goal) is driven by an underlying vision of the human right to the privacy of all forms of personal data.

Why do I describe all this? Because when viewed in light of this vision, Ente isn't a single app, it is a platform, and storing its code as a monorepo is the ideologically appropriate choice.

This is similar to, say, the Linux kernel. Most people don't realize that the biggest open source project in the world, by most metrics imaginable, the Linux kernel itself, is a monorepo. Even though it is called a kernel, ideologically it really is the full platform, device drivers and all, and the code organization as a monorepo reflects that.

Staying close to the vision of Ente as a platform is not only about ideology; it has practical offshoots too.

For example, a few years ago, we realized that there was no good open source end-to-end encrypted OTP app with cloud backups. So we built one, for our own use, because it was rather easy to build it on top of the primitives we had already created for the photos app.

Today, this side project is the #1 OTP app in the world with the aforementioned characteristics. This might seem like a happy accident, but it isn't, this was always the plan: build a solid platform, then one by one tackle the various bespoke apps we'll need to best handle different forms of data.

Microrepos

So ideologically Ente is best kept as a monorepo. But it wasn't one to start with, due to various historical factors in how the product evolved. What was a hardware device transitioned into software. The server component was closed source until we had the bandwidth to get it audited. Weekend projects like Auth outgrew their reach. Etc.

Let us rewind the tape back to, say, 2 years ago (just to pick a roughly symmetrical split). While we have grown since then in all product aspects including number of developers, we are extremely cautious in adding engineering headcount, so the number of developers hasn't grown that much. Thus it is a similar number of developers working on the same number of products (Ente Photos, Ente Auth) multiplied by the same number of platforms (mobile, web, desktop, server, CLI).

2 years ago, these codebases were spread across a dozen or so repositories.

In February we decided to take time out to finish the task for open sourcing the server side. This was a natural point to also rein in the proliferation of codebases, and we took this as a chance to move to a monorepo.

So, as a similar sized team doing similar work, we've experienced about a year with a split microrepo setup, and about a year with the alternative combined monorepo setup.

Summary

If I had to summarize the difference: Moving to a monorepo didn't change much, and what minor changes it made have been positive.

This is not coming as a surprise to us. Most of us didn't care strongly about our repository organization, and overall we weren't expecting much from changing it either. The general vibe was a monorepo might be better, and so why not, and since none of us opposed the choice, we went ahead, but we weren't trying to "solve" anything by the change. We were already happy with our development velocity.

And indeed, overall it hasn't changed much. We're still happy with our development velocity, so it did not get in our way. There have been many small wins however, so for the rest of this post I'll delve deeper into them.

Less grunt work

This is the biggest practical win. There is much less grunt work we have to do.

As an example, take the following pull request. It changed the ML model that is used for computing on-device face embeddings.

Screenshot of the GitHub view of a pull request that changed multiple subsystems in Ente's repository

This change affected (1) the photos mobile app, (2) the photos desktop app, (3) the photos web app, and (4) the ML scaffolding code itself.

In the previous, separate repository world, this would've been four separate pull requests in four separate repositories, and with comments linking them together for posterity.

Now, it is a single one. Easy to review, easy to merge, easy to revert.

Fewer submodules

Submodules are an irritating solution to a real problem. The problem is real, so a solution is welcome, and submodules are indeed an appropriate solution, but they're irritating nonetheless.

All this is to say, we appreciate the existence of git submodules as a way to solve practical code organization problems, but we wish we didn't need to use them.

Monorepos reduce the number of places where a submodule would otherwise be required, and that is a win.

As an example, previously the web and desktop codebases for the Ente Photos app had a submodule relationship. This required a PR dance each time a release had to be made or some other important change pushed to main. All that's gone now. These two interdependent pieces of code now directly refer to each other, and changes can be made to them atomically in the same commit.

More stars

This is the biggest marketing win. Previously our stars were spread out across the dozen or so repositories. If each had a thousand stars, we'd still have 12k stars in total, but because of the way both human psychology and GitHub's recommendation algorithms work, it'd come off as less impactful than a single repository with 12k stars.

Easy

One of the concerns we had going into this was that it might impact our development velocity. We thought we'd have to invent various schemes and conventions to avoid stepping on each other's toes.

Those concerns turned out to be unfounded. We didn't invent anything, waiting to see if the need arose, and it never did. So for an individual engineer in their day to day work, the move has been easy since we didn't ask anyone in the team to change their workflows in any way.

There still are no "repository wide" guidelines, except two:

  1. There should not be any repository wide guidelines
  2. Don't touch the root folder

That's it. Within each folder, or subteam of ourselves, we are otherwise free to come up with whatever organization, coding conventions, or whatnot we like.

I do realize that maybe the ease for us was a function of both the relatively small size of our team, and the amount of trust we have in each others' competence, and both these factors might not be replicable in other teams.

Long term refactoring

Refactoring across repository boundaries requires much more activation energy as compared to spotting and performing gradual refactorings across folder boundaries. Technically it is the same, but the psychological barriers are different.

As an example, we've already merged many of our disparate web apps into a similar setup, without needing to make elaborate upfront plans. It happened easily and naturally, since we could see all of them "next to each other", and the opportunities for code reuse became readily apparent.

Connectedness

This way of "working in a shared space without working in the same folder" has lead to us feeling more connected to each other's work as compared to when, individually or as subteams, we were all committing to separate repositories.

Previously, it was easy to get lost in one's work (in a good way), but sometimes it led to the feeling of working on a small part without being able to see the whole (in a not so good way).

Now, one can still remain lost in one's own work in the universe of one's own "folder", so that part of the goodness remains. But there are now also additional subtle cues that let us see how what we are doing is part of an interconnected whole. So it's a win-win.

What I described might be too abstract, so let me give an example. Every time I do a git pull, I get to see all the changes that my team mates have been working on. The names of the recently changed files. The number of changes in them. The names of the recent branches. The tags that were recently pushed. All of these are individually very low-bit, imprecise information vectors, and I don't even consciously look at them.

But what I've found over time is that, subconsciously and automatically, these "environmental cues" give me a great sense of "all that is happening around". What features are being worked on, what stage of completion they are at, what bugfixes were pushed, what releases were recently made.

Similar serendipitous information exchange happens when I, say, open the pull requests page and, without even intending to, glance at the stuff others are up to.

The best part is, all of this is subverbal and effortless. Everybody just does their thing, and just by virtue of doing them all in the same shared digital space, arises a sense of awareness and connectedness.

Wrapping up

This is already too long, much longer than I intended to write, so let me stop now.

I could offer tips, but I don't think there is any secret technical sauce that is needed. One thing that had bothered me before the move was how we would manage our GitHub workflows, but that turned out to be trivial, since we can scope GitHub workflows to only run on changes to a specific folder.

An engineering-mindset retrospective document would be incomplete without both a Pros and a Cons section, but we haven't really found any cons that have affected us so far, so excuse that exclusion.

On a personal level, what I've liked most about the move to our monorepo is the feeling of being part of a juggernaut that is relentlessly rising towards perfection, and has attained an unstoppable momentum. The code I'm writing is not an isolated web component or a goroutine or a little documentation fix, it is now part of this singular platform that will outlive me.

Auth v4

12 October 2024 at 07:00

It's been a few months since our last major update for Auth, and we're very happy about all the love that it has received so far.

From LinusTechTips picking Auth as the best 2FA app to CERN's recommendation, we're grateful to be at the receiving end of praise.

We are now here to showcase v4.0, that comes with some major improvements.

Changelog

Here are the highlights of what has been added since v3.0.

Sharing

Screenshot of Auth's share feature

You can now easily share specific codes with your team, with temporary links that are end-to-end encrypted.

These links will be valid only for the duration that you've chosen. Ente will pre-compute the codes for this duration, and share those within the link, end-to-end encrypted.

Your 2FA secret / seed phrase is not shared, and remains secure on your device.
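To make the idea concrete, here is a rough sketch of what "pre-computing codes for a duration" could look like, written with the third-party pyotp library. This is an illustration of the concept only, not Ente's actual implementation; the one-hour validity window is a hypothetical example.

```python
# Illustrative only: pre-compute TOTP codes for a chosen validity window, the
# way a share link could embed codes without ever revealing the secret itself.
import time
import pyotp

secret = pyotp.random_base32()          # in practice, the account's existing 2FA seed
totp = pyotp.TOTP(secret)               # default: 30-second period, 6 digits

window_hours = 1                        # hypothetical link validity chosen by the user
now = int(time.time())
precomputed = [
    {"valid_from": t, "code": totp.at(t)}
    for t in range(now, now + window_hours * 3600, totp.interval)
]

# Only `precomputed` (end-to-end encrypted) would be shared; `secret` never leaves the device.
print(precomputed[:3])
```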

Notes

Illustration of the Notes feature within Auth

Auth now supports notes within codes.

You can attach any text (up to a maximum of 500 characters) to provide more information about your accounts.

Like with everything else, your notes are preserved end-to-end encrypted.

Trash and recover

Illustration of Auth's Trash and Recover features

You can now trash codes that you don't need.

These items will be moved to a separate section within the app, under the "Trash" header.

You can recover codes from here, in case of accidental deletions, or delete them permanently.

App lock

Illustration of Auth's lockscreen

You can now further protect your codes with a custom lock screen that supports PINs and passwords. So you can protect your codes with either of these, instead of relying on your device lock screen.

CLI

Starting this release, Ente's CLI will support accessing your data on Auth.

This means you can go ahead and script a workflow that works for you. You can find more information on using our CLI to export your data from Auth here.

Flatpak

Last but not least, Auth is now officially available on Flathub!


If you like what we're doing, please support us on Product Hunt.

Mozilla grants Ente $100k

23 September 2024 at 07:00

Ente has been accepted into the Mozilla Builders accelerator program!

We're excited for this opportunity to work with some of the best minds that have shaped our internet.

As a part of this program, Ente will receive $100,000 in non-dilutive funding.

Mozilla Builders

Earlier this month, Mozilla invited us to an event in New York, where we had the chance to meet the Builders team. Everyone was smart, kind, and incredibly supportive.

The Builders accelerator was launched by Mozilla to promote independent AI and machine learning. The theme for 2024 is Local AI - AI that runs locally on your devices.

Ente

At Ente, we use Local AI to deliver features like face recognition and magic search, while respecting the privacy of your photos.

We'll now join a cohort of builders pushing technology forward for an AI that is light, private and accessible.

Together

Ente's lil ducky with a fox

Over the next few months, we'll work closely with the Builders team to accelerate Ente's growth and development.

We believe this is an important milestone in Ente's journey. There is much we can learn from Mozilla about building an open source, consumer tech company with positive impact.

We'd like to thank Monica, Liv, John, Stephen, and the rest of the Mozilla team for this opportunity. We look forward to building together.
