How to Manage Large 3D Assets (Models, Textures & Gaussian Splats) with Git LFS on GitHub

By Almaz Khalilov

TL;DR

  • You'll build: a version-controlled repository that can handle very large 3D asset files (e.g. detailed models, high-res textures, or even point-cloud data like Gaussian splats) without bloating your Git history.
  • You'll do: Install and enable Git LFS → Configure tracking for large file types → Push a sample large asset to GitHub → Verify collaborators can pull the asset → Integrate Git LFS into your project's workflow with best practices.
  • You'll need: a GitHub account (with Git LFS enabled), a development PC (Windows/Mac/Linux) with Git & Git LFS installed, and one or two large 3D asset files (like a .fbx model or a big .png texture) for testing.

1) What is Git LFS?

What it enables

  • Version control for huge files: Git Large File Storage (LFS) is an open-source Git extension that lets you version large files (even files of a few GB) in Git repositories. Instead of storing the file contents in Git (which struggles with big binaries), LFS stores a tiny pointer file in Git and keeps the real content in a separate storage system. This allows you to track changes to big assets (models, textures, etc.) just like code.
  • Manage 3D assets in Git workflows: For game developers or 3D artists, this means you can keep high-poly models, 4K/8K textures, and even novel 3D representations like Gaussian splat point clouds under version control. For example, a Gaussian Splat scene file ~77 MB (representing a detailed point cloud) and its 500 MB mesh equivalent can both be tracked with history using LFS. You get to maintain photorealistic assets in Git without the repo becoming unmanageable.
  • Faster cloning & less bloat: Repos using LFS are much leaner. Your team doesn't download every version of every large file on clone - only the files needed for the current checkout are pulled on-demand. This lazy fetching means faster clone/pull operations, since you skip gigabytes of historical data you don't need immediately. It keeps your repo size small by storing big files externally, improving performance.

When to use it

  • Game dev and 3D projects: Use Git LFS when your project involves large binary assets - e.g. Unity/Unreal projects with high-res textures, 3D model files, audio clips, or cinematic videos. It's invaluable for game developers who hit GitHub's normal file limits with big textures and models. If you've ever seen the "file is 150 MB; exceeds GitHub's 100 MB limit" warning, that's exactly when LFS is needed.
  • Visualizations, AR/VR, photogrammetry: If you work with 3D scans or Gaussian Splatting outputs (which produce large point cloud-like datasets), LFS lets you version those artifacts. For instance, you might capture a 3D scene as Gaussian splats (tens or hundreds of MB per scene) - LFS will handle those .ply or splat files so you can collaborate on them. Similarly, architecture or VFX projects with huge textures or HDR images benefit from LFS.
  • Data sets and media in code repos: Outside of 3D, anytime you have big media files (HD videos, large datasets, audio samples) alongside code, consider LFS. It's ideal when you want everything in one Git repo for convenience, but the binary blobs are too heavy for Git's normal operation: Git repositories are designed to handle text files efficiently, and storing large files in Git can make it difficult to collaborate effectively. LFS shines for team workflows where everyone needs the latest binaries but you don't want Git itself to choke on them.

Current limitations

  • Storage and bandwidth limits: Git LFS is not unlimited. For example, GitHub's free tier gives 2 GB total LFS storage and 1 GB/month bandwidth for LFS transfers. Large game projects or heavy scan data can exceed this (think dozens of 4K textures or many scene captures), so you may need to purchase additional LFS quota or use an alternative storage solution if you blow past 2 GB. Also, Git hosting services often hard-block files over 100 MB in normal Git - which is why LFS is needed once files approach ~100 MB.
  • No binary diffs or compression magic: LFS doesn't diff the content of binaries - each version you commit is stored in full on the LFS server. This means if you commit a 500 MB model 5 times, that's ~2.5 GB used. It saves your Git repository from that bulk, but your LFS storage usage grows with each version. LFS also won't auto-compress your files beyond what you do yourself (you might still use .zip or other compressions to reduce file size before tracking, if appropriate).
  • Merge conflicts on binaries: Git cannot merge changes inside binary files. If two people edit the same 3D asset, you'll hit a conflict that you must resolve manually (usually by choosing one version). To mitigate this, Git LFS introduced a file locking feature: team members can "lock" a file before editing to signal exclusive access, which helps resolve conflicts in large binary files. Locking prevents others from pushing changes to that file until it's unlocked. It's optional but very useful for large binaries that can't be merged. Remember, locking requires using LFS-aware commands (git lfs lock/unlock) and is supported by GitHub and others - it helps avoid the "two people edited the model at once" problem.
  • All contributors need LFS setup: Everyone cloning the repo (and your CI servers) must have Git LFS installed. If not, they'll end up with tiny pointer files (text files with an SHA pointer) instead of the real assets. In other words, without LFS, a file like model.fbx will appear as a ~130 byte text file that simply lists the real file's hash and LFS info. This is by design, but it means new contributors must remember to install LFS. It's usually a one-time setup (and we'll cover it), but it's a "gotcha" if someone forgets - the project will have missing assets until they run git lfs pull.
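To make the pointer format concrete, here is a small sketch that writes out what a stranded pointer file looks like and one way to spot it (the file name Dragon.fbx and the hash are made up for illustration):

```shell
# Illustrative sketch: what an LFS pointer looks like on disk when the real
# content was never fetched (file name and hash are invented for this demo).
printf 'version https://git-lfs.github.com/spec/v1\noid sha256:abcd1234\nsize 150000000\n' > Dragon.fbx

# A pointer is a tiny (~130 byte) text file starting with the LFS spec URL;
# the real asset would be a large binary. Seeing this means: run `git lfs pull`.
if head -c 44 Dragon.fbx | grep -q 'git-lfs.github.com/spec'; then
  echo "Dragon.fbx is an LFS pointer, not the real asset"
fi
```

A quick `head -c 50 <file>` like this is a handy sanity check whenever an asset looks suspiciously small after a clone.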

2) Prerequisites

Access requirements

  • GitHub account: You'll need a GitHub account (free is fine). Git LFS support is built into GitHub; you don't have to enable anything special for basic use. Just be aware of the 2 GB storage cap on free accounts - if your assets will exceed that, plan to upgrade or use an alternate Git hosting that allows more.
  • Git LFS installed: Install the Git LFS extension on your development machine. On Windows or Mac, you can download the installer from the official site or use a package manager (brew install git-lfs on Mac, choco install git-lfs on Windows). On Linux, install via your distro's package manager or download the binary. After installation, run git lfs install once to configure Git to use LFS and set up global settings and hooks.
  • Git client access: If you use a Git GUI or IDE (VS Code, GitHub Desktop, SourceTree, etc.), check that it supports LFS. Most modern tools do. SourceTree (by Atlassian) even bundles Git LFS and was built with LFS in mind. If using Git from the command line, ensure you have Git 2.x+ (for best compatibility) and that your normal authentication (SSH keys or PAT for HTTPS) is set up - LFS will use the same credentials to push/pull large files.

Platform setup

  • Local Development Environment:
      • Operating System: Git LFS works on Windows, macOS, and Linux. Ensure you have a terminal or Git client set up. (For Windows, Git Bash or PowerShell with Git is fine; on Mac/Linux, use Terminal.)
      • Git installed: Install Git itself if not already. (This is separate from Git LFS.) Many OSes have it pre-installed or available via package managers.
      • Git LFS on all machines: Every team member or build agent needs to install Git LFS. This includes CI servers - for example, on GitHub Actions, use the checkout action with LFS support or run git lfs install && git lfs pull in your pipeline. Missing this can cause builds to not find needed assets.
  • GitHub repository:
      • Create or use a repo: Have a GitHub repo ready for your project. If starting fresh, create a new repository on GitHub (private or public as you prefer). If using an existing repo that already has large files committed, note that simply adding LFS now won't retroactively shrink your repo - you'll need to migrate existing large files out of history (we address that in Quickstart B). For a new repo, no special settings are needed on GitHub; LFS is supported by default on all repositories.
      • Permissions and collaborators: If it's a team project, ensure all collaborators have write access to the repo and are aware they must install Git LFS. There's no separate permission for LFS on GitHub - if someone can push to the repo, they can push LFS files as well. However, if you're in an organization with enforced policies, make sure LFS usage is allowed and not hitting any company rules or firewall issues (LFS uses HTTPS to transfer files).

Hardware or environment

  • Storage space: Large files require disk space. Ensure your local machine has enough space for the assets you'll be adding (and multiple versions, if you will keep history). For example, adding a couple of 500 MB model files will obviously need >1 GB free locally. Likewise, if your CI pipeline pulls these, ensure the runner has sufficient storage.
  • Network considerations: Pushing and pulling hundreds of MBs to Git LFS will use bandwidth. A stable internet connection is recommended, or at least plan around the time it takes to transfer big files. Git LFS transfers can be interrupted and resumed, but a flaky connection might cause pushes to fail and require a retry.
  • Optional - LFS alternatives for huge teams: If you expect many large files or sizes well beyond a few GB, be aware of alternatives like artifact storage or dedicated game asset versioning systems (e.g. Perforce). Git LFS works best when your large file set is in the gigabyte range, not tens of GB. It can handle multi-GB files, but pushing 50 GB of assets through GitHub may be slow and costly. In those extreme cases, consider splitting assets to cloud storage and only keeping pointers/links in your repo.

3) Get Access to Git LFS

  1. Sign in to GitHub & create a repo: Log in to your GitHub account and create a new repository for testing Git LFS (or use an existing repo where you want to enable LFS). For a new project, you can initialize it with a README, or you can add files later. If you already have a repo, make sure you have it cloned locally. There's no special “LFS repo” type - any repo can use LFS.
  2. Install Git LFS client: If you haven't already, download and install Git LFS from the official site (git-lfs.com) or via package manager. Once installed, open a terminal and run: git lfs install. This needs to be done once per machine (it sets up a global config and hooks). If successful, it will output "Git LFS initialized" and set up a pre-push hook to handle LFS file uploads.
  3. Enable LFS tracking in your repo: Navigate in terminal to your repository folder. Decide which file types you want LFS to handle. Common 3D asset types include .fbx, .obj, .png, .jpg, and .mp4, and perhaps custom extensions like .gsplat (a hypothetical extension for Gaussian splat data). Use git lfs track "<pattern>" for each type. For example: git lfs track "*.fbx" and git lfs track "*.png". This command will add entries to a .gitattributes file in your repo, telling Git to use LFS for those patterns. Don't forget to commit the .gitattributes file to save these tracking rules.
  4. (Optional) Check GitHub LFS settings: By default, your repo will accept LFS files - no per-repo switch is needed on GitHub.com. You can review how much LFS storage and bandwidth you're using under your account's Settings → Billing (look for "Git LFS Data"), and purchase more there if needed. In enterprise setups, an admin might need to enable LFS on the server or increase quotas for your project. For GitHub.com, 2 GB is free.
  5. Obtain credentials (if prompted): Git LFS uses the same auth as Git. Typically, if you authenticated Git operations via SSH, LFS will piggy-back on an HTTPS token. The first time you push an LFS file, GitHub might prompt for credentials to the LFS endpoint. If you have 2FA, you may need a Personal Access Token with repo scope. Ensure you have a token ready if using HTTPS. (If you're using SSH remotes, GitHub still does LFS transfers over HTTPS internally, but your saved credentials in Git should handle it.) In most cases, you won't need to do anything special here - just be aware if a credentials popup or error appears, logging in with your GitHub credentials or token will resolve it.
  6. Done when: your local repo is LFS-enabled. This means you have run git lfs install, you have a .gitattributes file listing large file patterns, and you're ready to add large files. You should also see in GitHub settings (or via git lfs ls-files command) that no files are tracked yet (since we haven't added one). Now you can proceed to actually add and test large assets in the repo.

4) Quickstart A - Add a Large 3D Asset in a New Repo (Basic Workflow)

Goal

Set up a fresh repository with Git LFS and verify that pushing a large asset works. We'll walk through adding a big file (e.g., a 3D model or texture) to a new GitHub repo using Git LFS, and then confirm that everything is stored and retrieved correctly. By the end, you should have a sample app/repo on GitHub that uses LFS for at least one large file, and you'll confirm that others (or you on another machine) can clone it successfully.

Step 1 - Get the sample repository

  • Option 1: Create a new repo on GitHub and clone it. On GitHub, click “New Repository”, give it a name (e.g. LFS-3D-Test), and do not add any large files yet. Clone this empty repo to your local machine using git clone. This will be our sandbox.
  • Option 2: Use an existing example. (If you prefer, you could fork an existing repo that uses LFS, such as Apress's repo-with-large-file-storage example. If you do this, make sure to install LFS before cloning. However, for learning, it's better to start from scratch so you understand each step.)

After this step: you should have a local Git repository ready. If you run git status, it's either empty or just has minor files (like README). No large files added yet.

Step 2 - Install dependencies (Git LFS) in the repo

  • Initialize LFS in this repo: In your repo directory, run git lfs install (if you didn't already globally). This adds LFS hooks to this specific repo. It might say "Updated pre-push hook" indicating LFS is now active for this repo.
  • Track file types: Decide on a file type to test. Let's say we have a big model.fbx (Filmbox format) and a texture.png. Run:

```bash
git lfs track "*.fbx"
git lfs track "*.png"
```

This will create (or update) a .gitattributes file with those patterns. (The .gitattributes entry for *.fbx will look like: *.fbx filter=lfs diff=lfs merge=lfs -text.)

  • Confirm tracking: Run git lfs ls-files. At this moment, it might show nothing tracked yet (since we haven't added files). But you can run git lfs track with no arguments to list patterns. You should see your patterns listed as “tracked” in the output.
  • Commit the attributes: Stage the .gitattributes file (git add .gitattributes) and commit it: git commit -m "Configure Git LFS for .fbx and .png files". This ensures the tracking rules are in place in your repo history before adding the large files.

Step 3 - Add a large asset file

  • Prepare a large file: Take an existing large 3D asset from your system. For example, use a high-poly model file Dragon.fbx (~150 MB) or a big texture Skybox.png (~120 MB). If you don't have a real file that big, you can create a dummy file for testing (e.g., use a tool or script to generate a 100 MB file). The key is that it exceeds GitHub's 100 MB normal limit to really test LFS.
  • Add the file to the repo: Copy the large file into your repository folder (e.g., put Dragon.fbx into a Assets/ directory in the repo). Now, stage it with git add Assets/Dragon.fbx. Git LFS will intercept this add. Instead of adding 150 MB to Git's index, it adds a small pointer file. You can actually open the staging area (or after commit, open the repo file in a text editor) to see the pointer's content - it will look something like:

```text
version https://git-lfs.github.com/spec/v1
oid sha256:abcd1234... (a long hash)
size 150000000
```

That's the LFS pointer.

  • Commit the file: Run git commit -m "Add large model via LFS". The commit will be created quickly (notice it's not trying to delta-compress 150 MB - LFS already handled it). At this point, the large content is stored in your local LFS cache (likely under .git/lfs/objects). The Git history just knows about the pointer.
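If you don't have a real 100 MB+ asset handy, a stand-in of the right size can be generated from the shell (a sketch; the path, name, and size are arbitrary):

```shell
# Generate a ~120 MB dummy "asset" to exercise the LFS workflow.
# (dd reading /dev/zero assumes a Unix-like shell; on Windows use Git Bash.)
mkdir -p Assets
dd if=/dev/zero of=Assets/Dummy.bin bs=1M count=120 2>/dev/null
ls -lh Assets/Dummy.bin
```

Remember to `git lfs track` the file's extension before `git add`, or the dummy file will go into regular Git history and be rejected at push time.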

Step 4 - Push to GitHub (uploading LFS content)

  1. Push the commit: Run git push origin main (assuming your branch is main). Git will first push the regular commit (which contains the pointer), then the Git LFS pre-push hook will trigger an upload of the actual large file to GitHub's LFS storage. The output will show something like: Uploading LFS objects: 100% (1/1), 150 MB... followed by a success message. The first time, you might be asked to authenticate for LFS if your credentials aren't saved.
  2. Verify on GitHub: Go to the GitHub web UI for your repo and find the file (e.g., navigate to Assets/Dragon.fbx). GitHub won't show a preview (for binary, it usually shows a download option). You should see a message that the file is stored with Git LFS, along with a pointer or a link to download. This confirms that GitHub recognized it as an LFS file. If something went wrong and it wasn't stored as LFS, either the push would have been rejected (if >100 MB without LFS) or the file would appear as binary garbage in the repo (which we don't want).
  3. Check usage: On GitHub, your account's billing settings show LFS storage and bandwidth consumed. GitHub also shows text on the file's page indicating LFS usage ("Stored with Git LFS"). Everything should indicate the large file is not in Git history but in LFS.
  • Done when: Your large file is now in the GitHub repo via LFS. The commit on GitHub is small (just pointer text), and the big content resides in GitHub's LFS storage. You should not see any warnings about file size. If you inspect your local repo's .git/lfs/objects, you'll see a folder containing the file's content (proving LFS stored it).

Step 5 - Clone and test the LFS file retrieval

Now we'll simulate a teammate (or a CI server) pulling this repository to ensure they get the large asset:

  • Fresh clone on another machine (or location): If you have a second computer, or simply a different directory, try cloning the repo anew: git clone https://github.com/YourUser/LFS-3D-Test.git LFS-3D-Test-clone. Ensure that Git LFS is installed/enabled in that environment first (you can run git lfs install just in case).
  • Observe clone behavior: During git clone, after the normal Git objects are downloaded, you should see output like: Downloading Assets/Dragon.fbx (150 MB). Git LFS will fetch the actual file as part of the checkout. The clone completes with the large file present in the working directory. If you open the file on disk, it should be the real content (e.g., you can open the model in a 3D viewer or check that the PNG opens in an image viewer).
  • Alternative zip download test: If someone downloads the repo as a ZIP from GitHub (without using Git), they will get a ZIP containing the pointer file (since GitHub's ZIP export doesn't include LFS content by default). This is important: such a user would have to manually download the file from the file's page on GitHub. (GitHub provides a download link for LFS files in the UI.) This is more of an edge-case (usually developers will clone properly), but it's good to know. In our test, since we cloned with LFS, we got the file automatically.

Verify

  • Repository shows LFS pointers: On GitHub, verify that the added file is tracked by LFS. The file listing should indicate LFS usage, and the repo's Git history should not be bloated by the file size. For instance, check that the repository size on GitHub remains small despite the large file (it might still count it separately in usage stats but not in git object size).
  • Cloned working copy has real files: In the fresh clone, open or list the large file. It should be the full-size asset (e.g., opening the image shows the actual picture, the model can be loaded, etc.). This confirms that Git LFS delivered the content. If instead you see a pointer text in place of the file, then LFS didn't pull it correctly.
  • No large-file push errors: The git push went through without any “file too large” errors. If you saw an error, it means something wasn't set up (for example, if you forgot to run git lfs track before adding, Git might have tried to push the binary normally and failed at 100 MB).
  • Subsequent changes track correctly: (Optional) Modify the large file slightly (if feasible) or add a second large file, commit, and push. Verify that those updates also go through LFS. This ensures your .gitattributes is doing its job for all future additions.

Common issues

  • "File X is 150.00 MB; exceeds GitHub's limit" - If you got a warning or error like this on push, it means the file was not caught by LFS. Likely causes: you didn't run git lfs track before adding the file, so Git treated it as a normal file. The fix is to remove that commit or file from Git, run git lfs track, then add and commit again. (If the commit was already pushed, you'll need to delete it from history - easier to catch this before pushing.)
  • Pointer file seen instead of real file - If after cloning (or pulling) on another machine you open the file and see text starting with “version https://git-lfs.github.com/spec/v1…”, that means the LFS content wasn't downloaded. This happens if Git LFS isn't installed or if the LFS fetch was skipped. Fix: Ensure you ran git lfs install. If you forgot during clone, you can run git lfs pull now to download the files. In GitHub Actions, this can happen if you don't enable LFS in the checkout action (use with: lfs: true).
  • Push fails due to LFS auth - If git push failed at the LFS uploading stage with an authentication error, your Git credentials might not have propagated to LFS. For example, if using SSH remote, you might need to have a personal access token set up for LFS. One workaround: use a HTTPS remote with a saved credential. Or configure Git to store credentials and try again. On CI, ensure you provide an auth token that allows LFS pushes.
  • Exceeding quota or bandwidth - If your file pushed but then further pushes fail with messages about quota, you may have hit the 1 GB/month bandwidth limit or 2 GB storage limit. The Git LFS push output or subsequent attempts will tell you if you're over the limit. The fix is to purchase more LFS data from GitHub or reduce what you're storing. Note that deleting LFS files from the repo doesn't immediately free server storage (on GitHub, fully removing LFS objects generally requires rewriting history and may require contacting support), and git lfs prune only cleans unreferenced files from your local cache, not the server.
  • Large file not tracked in earlier commit - If you had earlier commits (in this or another repo) with giant files that were not in LFS, your repository might still be huge and slow. Simply adding LFS now doesn't retroactively fix those commits. In Quickstart B below, we discuss how to migrate existing history to LFS. In the meantime, if you face this, you might see very slow clones or big repo size. The immediate workaround is to use git lfs migrate import --include="*.ext" to rewrite history (warning: this alters commit history). Always back up or use on a branch when doing that.

5) Quickstart B - Migrate an Existing Project to Use Git LFS

Goal

Apply Git LFS to an existing project that already has large files in its history. This is a common scenario: for instance, you have a Unity project where you initially committed some .fbx models and .png textures to GitHub normally, and now your repo is huge or pushing is painful. The goal is to retrofit LFS so that future versions use LFS, and ideally clean up the past history to remove the large blobs from regular Git storage.

> Note: Migrating an existing repo's history is an advanced task and can rewrite commit history (affecting collaborators). If your project is early or all collaborators agree, this is fine. If not, you can choose to only use LFS for new commits going forward and leave old history as-is (with the cost of a large repo). We'll outline both approaches.

Step 1 - Analyze your repository for large files

  • Scan the repo history: Use a tool like git-sizer or git rev-list to find large files in your commit history. You can also check GitHub's repo insights or clone and run git lfs migrate info (which lists what would be migrated). Identify which file types or specific files are the culprits (e.g., .fbx, .png).
  • Communicate with team: If you plan to rewrite history to migrate these files into LFS, inform your team. They will likely need to re-clone the repository after the migration, since history will change. Coordinating this will avoid confusion.
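The history scan can also be done with plain git plumbing, no extra tools. A hedged sketch (it builds a tiny throwaway repo so the pipeline has something to scan; in your real repo you'd run only the final pipeline):

```shell
# Build a small demo repo with one "large" and one small file to scan.
rm -rf scan-demo
git init -q scan-demo
git -C scan-demo config user.email demo@example.com
git -C scan-demo config user.name demo
dd if=/dev/zero of=scan-demo/big.bin bs=1k count=200 2>/dev/null  # stand-in asset
echo hello > scan-demo/small.txt
git -C scan-demo add . && git -C scan-demo commit -qm "demo assets"

# The actual scan: every object in history -> keep blobs -> sort by size.
git -C scan-demo rev-list --objects --all |
  git -C scan-demo cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn | head -5
```

The last four lines are the reusable part: run them (without `-C scan-demo`) inside any repo to see its largest blobs and decide which extensions to migrate.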

Step 2 - Enable LFS tracking for future commits

  • Even before dealing with history, set up LFS tracking so that new additions use LFS. Follow the steps from Quickstart A: run git lfs install in the repo, and git lfs track "*.fbx" (and other patterns for your large files). Commit the .gitattributes. This ensures that from this point on, any newly added or modified large files will go to LFS.
  • If the large files are already in the repo (in past commits), just tracking them now won't automatically remove them from history, but it will catch any re-addition. For instance, if you edit an existing large model and commit, with LFS tracking it will store the new version in LFS (though the old version remains in Git history until purged).

Step 3 - Migrate existing files to LFS (history rewrite)

  • Backup and branch: Before migrating, it's wise to create a backup of your repo or do this in a separate clone. Also, ensure you have no uncommitted work. You might perform the migration on a new branch (like lfs-migrate) which you'll force-push to replace main.
  • Use git lfs migrate: Git LFS provides a command to rewrite history and move files into LFS. For example:

```bash
git lfs migrate import --include="*.fbx,*.png" --include-ref=refs/heads/main
```

This will go through the history of main and for every commit, any file matching those patterns will be replaced by an LFS pointer, with the file data added to LFS. It effectively reconstructs commits. After running, you'll get a rewritten history where those files are no longer in Git's pack files.

  • Verify and push: After migration, check git status. You might need to force-push (git push origin +main) because history changed. Make sure all collaborators have stopped committing during this process. Once pushed, the remote will now have LFS objects for those files, and the commit SHA history will be different.
  • Inform team to re-clone: Anyone who had the old clone should re-fetch and reset (or simply reclone fresh) to avoid confusion. The old commits with large files will no longer be in the main history. (They might still exist in the reflog or as dangling blobs on the server until pruned by GitHub, but not in any branch.)
  • Done when: your repository's current branches no longer contain huge files in regular Git. All large assets are tracked by LFS as per .gitattributes. The repo's Git clone size should drop (maybe dramatically, if you removed big blobs). The LFS usage on the server will increase accordingly (you effectively moved data from repo to LFS storage). At this point, normal development can continue, with all new large file changes going to LFS.

Step 4 - Update build/test environments

  • After migration, ensure any automated build or CI scripts still work. If your CI pipeline does a shallow clone or specific ref checkout, verify it fetches LFS. For GitHub Actions, add:

```yaml
- uses: actions/checkout@v3
  with:
    lfs: true
```

This ensures LFS files are pulled. For other CI systems, you may need to run git lfs pull after checkout.

  • If you have submodules or sub-repos, consider if they also need LFS tracking adjustments.

Step 5 - Clean up old data (optional)

  • The migrated history on the server will leave behind the old LFS-untracked blobs unless you contact the host or they auto-clean. For GitHub, over time unused data might be garbage-collected. If your repository settings show an enormous size even after migration, you may need to open a ticket with GitHub support to remove the dangling objects, or use the GitHub Garbage Collection workflow (if you have admin rights on an enterprise instance).
  • On your local, you can run git lfs prune to remove any LFS files in your local cache that are not referenced by current commits. This frees space on your machine, especially after migrating a lot of files.

Verify

  • Repository size reduced: Check the repository's size on GitHub (via your account's storage details or the API). It should be significantly smaller in terms of Git storage. The LFS usage will correspondingly show the data. For example, if you migrated 1 GB of models, your Git repository size might drop by 1 GB, and your LFS usage will go up by 1 GB.
  • Cloning is faster: Try cloning the repo anew. It should complete much faster than before, since it's not pulling down all historical versions of large files. Only the latest LFS files are downloaded on checkout. This is a key benefit: one user reported their 3D model repo went from interminable clone times to just seconds after moving large assets to LFS (since the initial clone only grabbed small pointers and a few current assets).
  • No more push warnings: When adding new large files or updating them, you no longer get warnings about the 100 MB limit. Instead, LFS seamlessly handles the push. (If you do still get warnings, maybe a file extension wasn't covered by .gitattributes - add it and commit.)
  • Team workflow intact: Ensure that you and others can still make new commits, merge branches, etc., without issues. If history was rewritten, double-check that important branches were migrated (you might need to migrate each branch separately or just main and have others rebase). Once everyone is on the new history, the day-to-day usage should be the same as any Git project, just with LFS behind the scenes.

Common issues

  • Git history rewrite woes: If someone didn't get the memo and continues on an old clone, they might try to push and get divergent history errors. The solution is to have them fetch the new commits and reset or re-clone. History rewrite is disruptive but once managed, it's fine.
  • Migrating tags and branches: The git lfs migrate example above only included main. If you have important tags or other branches that also contain large files, you need to migrate those too (e.g., --include-ref=refs/heads/* for all branches, or --everything for all refs). Otherwise, those old refs still hold big files. If some unmigrated history (like an old release tag) isn't needed, you can delete those refs to save space.
  • Locked files after migration: If you had set up LFS file locking, note that locks might be lost or need to be re-established after rewriting history (since technically those files are new objects now). Coordinate with your team on any ongoing locks.
  • Git LFS migrate tool limitations: The git lfs migrate command has many options and some limitations. For example, it might not migrate already LFS-tracked content to a new remote if you're switching servers. Always preview a migration first (git lfs migrate info shows what would be converted) to avoid surprises. In the worst case, tools like BFG Repo Cleaner can strip large files out of history, after which you re-add them via LFS manually.

6) Integration Guide - Best Practices for Using Git LFS in Your 3D Asset Workflow

Goal

Integrate Git LFS seamlessly into your day-to-day development of a 3D project (game, simulation, AR app, etc.). Now that the basic setup is done, this guide will ensure you can work efficiently: you'll set up your workflow so large files are handled automatically and reliably across your team, without nasty surprises. We'll cover project structure, automation, and how the pieces (your app/project files and LFS) interact.

Architecture

  • Git client (Dev machine) → Git LFS client → Remote Git repo + LFS storage (GitHub) → Teammate's Git LFS client → teammate's working copy. In practice, when you commit a change to a 3D model in Unity, for example, and push, the flow is: your Git LFS client uploads the model to GitHub's LFS store, and your commit (with a pointer) goes to the Git repo. When your teammate pulls, their Git LFS client sees the pointer and downloads the model from LFS store. This all happens mostly transparently after initial setup. The important part is making sure everyone has the LFS client active and the .gitattributes rules up to date in the repo.
  • Repository structure considerations: Organize your repo so that large assets are in predictable locations, making it easier to track them. For example, put all game artwork in an /Assets or /Art folder. This way you might simply track Assets/** in LFS if most files there are binaries. Keep source code separate (it remains in normal Git). This separation can also help if you ever need to export or migrate assets.
  • Asset pipeline integration: If your project uses a build pipeline or source control integration (like Unity's Cloud Build or Unreal's version control integration), ensure it supports LFS. Many tools now have settings for Git LFS or at least respect the .gitattributes. The key architecture point: treat LFS assets as first-class project files, just handled by a different storage under the hood.

Step 1 - Ensure all team members have Git LFS installed

This was mentioned before, but it's worth reiterating in an integration context: if a single collaborator doesn't set up LFS, they can inadvertently commit a large file without LFS or fail to retrieve a file, causing project inconsistencies. Add a note in your project's README or onboarding guide: “Run git lfs install after cloning, and make sure Git LFS is installed.” You might even include a check in a setup script. Definition of done: every active contributor's environment is LFS-ready.
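A minimal sketch of such a setup check (the function name is illustrative) could live in your project's setup script:

```bash
# Verify Git LFS is installed and activated before anyone starts working.
check_lfs_ready() {
  if ! command -v git-lfs >/dev/null 2>&1; then
    echo "ERROR: Git LFS is not installed - see https://git-lfs.com" >&2
    return 1
  fi
  # 'git lfs install' is idempotent, so running it on every setup is safe
  git lfs install
  echo "Git LFS is ready."
}

# Call from your onboarding/setup script:
# check_lfs_ready || exit 1
```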

Step 2 - Add LFS tracking rules to the repo (and keep them updated)

By now, you likely have a .gitattributes with patterns like *.fbx filter=lfs .... As your project grows, update this file for any new large file types introduced. For instance, if you start adding .mp4 videos for cutscenes, add *.mp4 to LFS tracking before committing them. The .gitattributes is versioned, so it's easy to update: just edit and commit. Keep it comprehensive - it's better to track a file type that might not always be huge than to miss one that is.

  • Definition of done: All relevant binary file types used in the project are listed as LFS tracked. This prevents the “oops, I committed a 200MB file without LFS” scenario going forward.
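For reference, a committed .gitattributes covering a typical 3D project might read like this (the exact extensions are examples - extend the list as your pipeline grows):

```
*.fbx   filter=lfs diff=lfs merge=lfs -text
*.blend filter=lfs diff=lfs merge=lfs -text
*.png   filter=lfs diff=lfs merge=lfs -text
*.exr   filter=lfs diff=lfs merge=lfs -text
*.mp4   filter=lfs diff=lfs merge=lfs -text
```

Each line is exactly what git lfs track "<pattern>" writes for you; editing the file by hand and committing it has the same effect.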

Step 3 - Automate where possible (Git hooks)

To further integrate LFS into your workflow, consider automation:

  • Pre-commit hook for large files: You can set up a git pre-commit hook that checks file sizes and auto-adds LFS tracking if a file is over a threshold (say 50 MB). For example, the REALDRAW team shared a script that scans staged files and if any exceed 100MB, it runs git lfs track on them automatically. This can prevent those accidental huge commits. Add such a hook to your repo (in .git/hooks/pre-commit, and ensure devs install it or use a tool like lefthook to distribute hooks).
  • Post-merge or post-checkout hook: These could remind devs to run git lfs pull if needed, but in general LFS does this on checkout automatically. If you use Git submodules or some custom workflow where LFS content might not auto-download, a hook can ensure the latest LFS files are fetched after switching branches or pulling new commits.
  • Continuous Integration automation: If your CI pipeline produces large artifacts (say, it builds a 3D model or bakes lighting data to images that you then commit), script the CI to use LFS for those outputs. E.g., your CI job can do git lfs track "*.exr" before committing newly generated EXR files, so they don't pollute history.
  • Definition of done: Common operations (commit, merge, etc.) will either automatically handle LFS or clearly warn, reducing human error. If a team member tries to commit a 200MB file without LFS, the hook should catch it and fix it or at least warn them.
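Here's a sketch of such a pre-commit hook (the threshold and messages are assumptions, not the REALDRAW script itself); save it as .git/hooks/pre-commit and mark it executable:

```bash
#!/bin/sh
# Block commits of large files that are not covered by an LFS rule.
THRESHOLD=$((50 * 1024 * 1024))  # 50 MB in bytes -- tune for your team

# True (exit 0) when the file is bigger than THRESHOLD bytes
file_too_large() {
  [ "$(wc -c < "$1")" -gt "$THRESHOLD" ]
}

# True when .gitattributes already routes this path through LFS
is_lfs_tracked() {
  git check-attr filter -- "$1" 2>/dev/null | grep -q 'filter: lfs$'
}

check_staged_files() {
  for f in $(git diff --cached --name-only --diff-filter=AM); do
    if [ -f "$f" ] && ! is_lfs_tracked "$f" && file_too_large "$f"; then
      echo "BLOCKED: $f exceeds 50 MB and is not LFS-tracked." >&2
      echo "Fix with: git lfs track \"$f\" && git add .gitattributes \"$f\"" >&2
      exit 1
    fi
  done
}

check_staged_files
```

Note the sketch's simple for-loop assumes filenames without spaces; harden it before relying on it for a large team.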

Step 4 - Establish a coordination workflow (for large-file edits)

In a team setting, especially with artists and developers collaborating:

  • Locking protocol: If you use LFS file locking, decide on a protocol. For example, before an artist starts tweaking a model, they run git lfs lock Assets/Dragon.fbx (the lock is registered with GitHub's LFS server immediately - no push is needed). When done, they run git lfs unlock Assets/Dragon.fbx. Communicate locks via your workflow tool or Slack if needed ("I've locked the dragon model to update animation"). This prevents others from editing it in the meantime - attempts to push changes to a locked file will be rejected by the server, as the lock acts as an advisory mutex. Not every team needs this (if conflicts are rare), but it's good for high-risk assets.
  • Working with branches: If multiple feature branches might each modify large assets (say two branches both change the same texture), merging will be painful. Coordinate such changes or use locks/communication to avoid it. Alternatively, plan to regenerate one of the versions if conflict arises (like re-export a texture). Git LFS won't solve the merge conflict, but it at least makes sure you're aware of them.
  • Clone shallow or exclude heavy content if needed: In some cases, a developer might not need all large assets (maybe a backend developer who just works on code). They can choose not to download LFS files by cloning with GIT_LFS_SKIP_SMUDGE=1 (which skips auto-downloading LFS content). They'll get pointer files and can save bandwidth. Document this technique if it fits your team (and how to manually fetch specific files with git lfs pull when needed). This way, LFS provides flexibility: not everyone has to pull everything if not needed.
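The locking protocol and the code-only clone boil down to a handful of commands. A sketch (repo URL and asset paths are placeholders), wrapped in an illustrative function:

```bash
demo_locks_and_partial_clone() {
  # --- Locking protocol ---
  git lfs lock "Assets/Dragon.fbx"    # registers the lock on the LFS server
  git lfs locks                       # see who currently holds which locks
  git lfs unlock "Assets/Dragon.fbx"  # release once your change is pushed

  # --- Code-only clone: keep pointers, skip the heavy downloads ---
  GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/your-org/your-game.git
  cd your-game || return 1

  # Later, fetch only the assets you actually need:
  git lfs pull --include="Assets/Skybox.png"
}
```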

7) Feature Recipe - Example: Updating a Texture Asset in LFS

Goal

Demonstrate a typical use-case step by step: updating a high-resolution texture in your project using Git LFS. For instance, you have a Skybox.png (50 MB) tracked in LFS, and your artist provides an improved version (60 MB). We'll replace the texture and ensure the new version is versioned, while the old version remains accessible in history if needed. This showcases how LFS handles versioning of binary assets.

UX flow

  1. Ensure LFS is set up: Developer confirms .png is in .gitattributes and LFS is active (this should already be true from initial setup).
  2. Replace the file: Overwrite the old Assets/Skybox.png with the new file (same name). In Unity, for example, you'd import the new texture, effectively modifying the file.
  3. Git status: Git will see Skybox.png as modified.
  4. Commit the change: Developer commits with a message like “Update skybox texture with new version”.
  5. Push: On push, Git LFS uploads the new texture to the server. The old version stays in LFS storage associated with the old commit (so you can still retrieve it from history).
  6. Teammate pulls: A teammate pulls the latest changes. LFS downloads the new Skybox.png automatically. Their game now uses the updated skybox.
  7. Optional rollback: If the new texture had an issue, one could restore the prior version with git checkout <old-commit> -- Assets/Skybox.png (note that git show <old-commit>:Assets/Skybox.png would output only the LFS pointer text, not the image data). LFS makes sure that old 50 MB file can still be fetched if needed (provided it hasn't been pruned).

Implementation checklist

  • File type tracked: Verify .png is in LFS tracking. (Check .gitattributes, or run git lfs track with no arguments to list the tracked patterns; git lfs status shows any LFS files currently in staging.)
  • Replace or edit asset: Copy in the new file with the same filename. (If you add a new file name, just ensure it's in a tracked folder or matches the pattern. If not, add the pattern and commit .gitattributes before adding the file.)
  • Stage and commit: Use normal Git commands: git add Assets/Skybox.png (stage the new version, which actually stages a pointer). Then git commit -m "Update skybox texture".
  • Push and verify: git push. The push output should indicate the LFS upload, e.g., Uploading LFS objects: 100% (1/1), 60 MB. On GitHub, check the file's history - you'll see two versions of the pointer (with different OIDs). GitHub might show the file size changes in commit diffs as just pointer text differences.
  • Teammate pull: When others run git pull, they will get the new commit and Git LFS will download the 60 MB file. Verify the teammate's Assets/Skybox.png is updated (and their game now has the new skybox).
  • Old version access: Test that you can still retrieve the old version if needed: e.g., run git log -- Assets/Skybox.png to find the old commit, then git checkout <old_commit_hash> -- Assets/Skybox.png (this will put the old version in your working tree). LFS will pull that old 50 MB file from the server. You could then commit it to a revert branch or just compare quality, etc. When done, restore the latest version with git checkout HEAD -- Assets/Skybox.png.

Shell sketch (for an automated script example)

Imagine you have a script to update an asset and log the change - for illustration, here's how it might look as a small shell function:

```bash
updateTexture() {
  local FILE=$1
  local COMMIT_MSG=$2
  if [ ! -f "$FILE" ]; then
    echo "Error: $FILE not found."
    return 1
  fi
  # Check whether the file's path is already covered by an LFS rule
  # (git check-attr prints "<path>: filter: lfs" for tracked files)
  if [ "$(git check-attr filter -- "$FILE" | awk '{print $NF}')" != "lfs" ]; then
    git lfs track "$FILE"
    git add .gitattributes
    echo "[LFS] Tracking $FILE"
  fi
  git add "$FILE"
  git commit -m "$COMMIT_MSG"
  git push
}

# Usage:
# updateTexture "Assets/Skybox.png" "Update skybox texture to new HDR version"
```

This script checks if the file is tracked, adds tracking if not, then commits and pushes. In practice, you'd rely on .gitattributes being correct, but this shows the flow.

Troubleshooting

  • Texture not updating for teammate: If a colleague pulls but doesn't see the new texture, likely their LFS didn't fetch. Have them run git lfs pull. If that still doesn't get it, check that their .gitattributes is up to date (maybe they didn't pull the commit that added the LFS tracking rule). Ensure everyone is on the same page with LFS setup.
  • Conflict on binary file update: If two people accidentally both updated Skybox.png on different branches, Git will flag a merge conflict (both modified the file). Because it's binary, Git can't merge automatically. The resolution is to pick one version or manually merge (perhaps using an image editing tool to combine changes if that makes sense). Use git lfs lock next time to avoid this scenario. On the command line, resolving the conflict means deciding which file to keep, then committing that. The losing version can still be found in that branch's history if needed.
  • File appears unchanged but you suspect different content: Sometimes, if an update is subtle and file size doesn't change much, it might not be obvious if the file updated. Use git lfs ls-files -l which lists LFS files with their OIDs; if the OID (hash) changed, the content changed. Or use an external diff tool for images (there are image diff tools) - Git LFS itself won't diff binary content.
  • Ensuring data integrity: Git LFS uses SHA-256 hashes (OID) to ensure the file isn't corrupted during transfer. If there's a corrupt download or wrong file, LFS will usually detect it via hash mismatch. If you ever see an error about OID mismatch on checkout, it means the file may be corrupt or wrong. Running git lfs fetch --all and git lfs checkout can redownload any problematic files. Such issues are rare but it's good to know the recovery.
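If you do hit an OID mismatch, a recovery pass might look like this sketch (the function name is illustrative):

```bash
repair_lfs_objects() {
  # Verify local LFS objects and pointers against their SHA-256 OIDs
  git lfs fsck

  # Re-download all LFS objects referenced by your refs...
  git lfs fetch --all

  # ...then rewrite working-tree files from the refreshed local store
  git lfs checkout
}

# repair_lfs_objects
```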

8) Testing Matrix

| Scenario | Expected outcome | Notes |
| --- | --- | --- |
| New clone with LFS installed | All large files download on checkout ✅ | Clone output shows LFS download progress. If some files don't appear, run git lfs pull. |
| Clone without LFS installed | Large files appear as pointers (text files) ⚠️ | User must install Git LFS and run git lfs pull to get the actual files. Educate the team about this step. |
| Adding >100 MB file without LFS | Push rejected by GitHub ❌ | GitHub blocks files over 100 MB in normal Git. Must track with LFS and recommit. |
| Exceeding free LFS quota (2 GB) | Push blocked or further LFS pulls fail ❌ | GitHub prevents adding more LFS objects beyond the quota. Solution: purchase more data or remove unused LFS files. |
| Multiple large files on different branches | Only files for the checked-out branch are downloaded ✅ | LFS fetches lazily per checkout. Switching branches downloads that branch's LFS files as needed. |
| File lock contention | Second user can't push to locked file ❌ | If User A locks a file, User B's attempt to push changes to it is denied until it's unlocked. This prevents conflicting edits. |
| CI pipeline build | CI successfully pulls LFS files ✅ | Ensure CI runs git lfs pull or equivalent. If not, builds may fail due to missing assets. |
| Removing a large file | File removed from repo; LFS storage remains (until pruned) | After git rm and a commit, the file is no longer in the latest history. To free local space, run git lfs prune; server-side cleanup may require host support. |
| Fork or merge from external | LFS files retained, but ensure the fork also has LFS enabled ✅ | Forking a repo with LFS on GitHub keeps the LFS pointers, and GitHub handles object storage for forks. Merging back works as normal provided LFS pointers align. |


9) Observability and Logging

To keep tabs on how LFS is performing in your project, consider:

  • Git LFS logs and status: Use git lfs status to see if any large files are awaiting upload or download. After a fresh pull, git lfs ls-files will list files managed by LFS along with their status. This is a quick way to see “are all my LFS files fetched?”.
  • Transfer logs: By default, LFS prints progress during uploads/downloads. If you need more detail (for debugging slow transfers, etc.), set the environment variable GIT_CURL_VERBOSE=1 or GIT_TRACE=1 before running Git commands to see the HTTP requests. This can help identify if you're hitting rate limits or other issues.
  • Repository LFS usage metrics: On GitHub, open your account or organization Settings → Billing to see how much Git LFS storage and bandwidth you've used. Monitor this especially on a growing project; it shows how close you are to the quota. For example, if you see 1.9 GB used of 2 GB, it's time to clean up old files or increase the plan. GitHub also emails repo owners when a repository is nearing or exceeding its LFS quota.
  • Custom logging hooks: You can augment your Git hooks to log events. For instance, in the pre-push hook (which LFS already uses), you could add logging to a file whenever LFS uploads something (e.g., log “Pushed model X of size Y MB at time Z”). Over time, this log could help analyze usage patterns (like which assets change frequently).
  • Ensure backups of LFS content: Git LFS storage (especially on third-party hosts) should be reliable, but it doesn't hurt to have backups of your raw assets outside Git as well. Since LFS content isn't stored in the bare Git repository clone, an extra backup of the Assets/ directory (or wherever large files live) could be part of your observability plan (to cross-verify nothing is lost). This is more about data safety than logging, but worth mentioning.
  • Integration with issue tracker: If a build fails due to missing LFS files or a quota issue, make it visible. For instance, if your CI hits an LFS error, configure it to post a message in your team chat or issue tracker. Typical LFS-related failures (auth, missing files, etc.) should be brought to devs' attention immediately, as they might not be as familiar as with code merge conflicts. Logging these incidents will help refine your workflow (e.g., “we had 3 incidents of missing LFS files in CI last month, maybe we need to improve our CI checkout step”).
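Several of these checks can be bundled into a quick local audit. A sketch (the function name and report layout are illustrative):

```bash
lfs_health_report() {
  echo "== LFS-managed files (with sizes) =="
  git lfs ls-files --size

  echo "== LFS files awaiting upload or download =="
  git lfs status

  echo "== Local LFS object cache footprint =="
  du -sh .git/lfs 2>/dev/null || echo "(no local LFS cache found)"
}

# For verbose transfer debugging on a single command, prefix it instead:
# GIT_TRACE=1 GIT_CURL_VERBOSE=1 git lfs pull
```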

10) FAQ

  • Q: Do I need special hardware or a separate server to use Git LFS?

A: No, you don't need any new hardware. Git LFS is a service provided by platforms like GitHub, GitLab, etc., using their servers as storage. You just install the client on your PC. The heavy lifting (storing the big files) is on the host's side (e.g., GitHub's cloud). Just ensure your local machine has enough disk space to hold the large files when you work with them.

  • Q: Which hosting services support Git LFS?

A: Most major Git hosting providers support LFS, including GitHub, GitLab, Bitbucket, Azure DevOps, etc. Some require it to be enabled per repository (e.g., GitLab has a setting, Bitbucket Cloud is on by default). If using a self-hosted Git server, you might need to set up an LFS server or use something like GitLab CE/EE which has LFS support built-in. Always check the docs for your platform - e.g., GitLab's docs note you can store large files outside the repo with LFS.

  • Q: Is Git LFS free? What are the costs?

A: The Git LFS client is free and open source. However, storing data on a host may incur costs. For example, GitHub gives 2 GB free LFS storage and 1 GB/month bandwidth. Above that, it's currently about $5 per additional 50 GB storage/month. GitLab's free tier gives 10 GB per repo (including LFS) but with transfer limits. Always review the pricing if your project is large. For open-source projects, sometimes sponsors or special programs might cover LFS costs, but generally budget for it if you have lots of huge files.

  • Q: Can I use Git LFS for my production game assets?

A: Yes, many game studios (especially small to mid-size teams) use Git LFS in production. It allows programmers and artists to collaborate in one repo. Just be mindful of the limitations: if your game's assets are hundreds of gigabytes, Git LFS might become cumbersome (both cost and speed-wise). In such cases, solutions like Perforce or dedicated asset sync tools (or simply storing assets on cloud storage and writing scripts to fetch them) might scale better. But for a lot of projects, LFS is a perfectly fine production solution. We've seen examples of Unreal Engine teams using Git LFS for 3D assets, with the caveat that discipline is needed to not version control every derived file and to purge unused assets periodically.

  • Q: Can I store machine learning models or other large binary data in LFS?

A: Absolutely, that's another common use. LFS is often used for ML model weights, datasets, etc., since those can be hundreds of MBs. Just keep in mind the same considerations about size. Some communities use alternatives like git-annex or DVC for very large data science files, which offer more granular control (and avoid GitHub's bandwidth limits). If using LFS for such data, you might consider GitHub's paid options or even Hugging Face hub (which uses git under the hood with LFS for models).

  • Q: How do I remove or delete a large file added by mistake?

A: If you committed a huge file to Git (not via LFS) and want to expunge it, you'll need to rewrite history. You can use git filter-repo or BFG Repo Cleaner to strip it out. Then force push. If the file was added via LFS and you simply want to remove it from the project, you can delete it and commit (that removes it from current version). The content still resides in LFS storage though (in case you rollback). To fully delete it from the server, you currently need to contact the host's support or wait for garbage collection (GitHub doesn't yet provide a user-facing purge for specific LFS objects). In practice, if you remove all references to that LFS file (no commits point to it), it should eventually get cleaned up. But don't treat LFS as a temporary storage; be deliberate with what you add.
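As a sketch of that expunge-and-re-add flow (paths are placeholders, and git filter-repo must be installed separately from Git):

```bash
expunge_and_readd_via_lfs() {
  # Rewrite history to drop every version of the oversized file
  # (filter-repo prefers to run in a fresh clone)
  git filter-repo --invert-paths --path "Assets/Huge.fbx"

  # Restore the current version from a backup kept outside the repo
  cp /path/to/backup/Huge.fbx Assets/Huge.fbx

  # Re-add it through LFS this time
  git lfs track "*.fbx"
  git add .gitattributes Assets/Huge.fbx
  git commit -m "Re-add Huge.fbx via Git LFS"

  # History changed, so collaborators must re-clone after this
  git push --force --all
}
```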

  • Q: If Git LFS stores files outside Git, do those files get versioned?

A: Yes - every version of your file corresponds to an LFS object tied to a Git commit. You can checkout any commit and you'll get the file version from that time. LFS itself doesn't do diffs, but it keeps the whole file for each version. So you get full version history, just like code (just at the granularity of whole-file changes). You can even use git log or GitHub's UI to see past versions of an asset (though you can't “diff” binary easily, you might download two versions and compare manually). Remember, old versions count against your storage until you purge them from history.

  • Q: Is there a file size too small or too big for LFS?

A: There's no hard minimum size - you could track a 5 KB file in LFS, but that's not useful and just adds overhead. Generally track files that are impractical for Git: a common threshold is around 50 MB (GitHub starts warning at 50 MB, disallows over 100 MB). As for too big, Git LFS can handle multi-gigabyte files in theory, but performance may suffer on really big files depending on your network. Also, hosts like GitHub limit single LFS objects to a max of 2 GB or so (this isn't super well-documented, but many have a soft limit). If you have a 10 GB file, LFS might not be ideal (and you'd hit storage limits fast). In such cases, consider splitting the file or using an external storage.

  • Q: Can I use Git LFS alongside other Git extensions (like Git-crypt, submodules, etc.)?

A: Yes, but test it. Git-crypt (for encrypting files in repo) doesn't directly integrate with LFS; if you need to encrypt large files, perhaps encrypt them before adding to LFS. Submodules can themselves contain LFS content (just treat each submodule as its own repo with LFS). Just remember that any environment using those submodules also needs LFS. Tools like DVC (Data Version Control) can work with Git LFS (DVC can even manage LFS pointers for you). The key is understanding each tool's role and ensuring they don't step on each other's toes in the Git hooks or filters.

  • Q: After migrating to LFS, my Git repo is smaller but my .git folder locally is still huge. Why?

A: If you migrated history, your local clone might still have the old objects in the .git folder (dangling blobs). Running git gc (garbage collect) may help, but it won't remove already pushed data easily. If you want a truly clean local, the simplest way is to clone the repository fresh after the migration. That way you only get the new slim history and the LFS files. Your old clone, unless you run aggressive cleaning, might still house the old objects in the pack files. On GitHub's side, the repo size visible should drop, but they too might still count the old stuff until a background GC runs or support intervenes.


11) SEO Title Options

  • “How to Use Git LFS to Manage Large 3D Models and Textures (Step-by-Step Guide)”
  • “Version Control Large 3D Assets (incl. Gaussian Splat point clouds) with GitHub LFS”
  • “Git LFS for Game Dev: Managing Big Models & Textures Without Repo Bloat”
  • “Complete Workflow for Handling Large Files in Git - from CAD Models to Photogrammetry Data”
  • “GitHub Large File Storage (LFS) Tutorial: Storing 3D Assets and High-Res Images”

12) Changelog

  • 2026-01-06 - Verified with Git LFS v3.7.1 on GitHub (Free tier). Tested pushing 3D model (~150 MB .fbx) and texture (~120 MB .png) files in a sample repo. Confirmed LFS tracking, push/pull, and quota behaviors with current GitHub limits. All steps and commands updated for latest Git and LFS versions.