arxiv:2406.09414

Depth Anything V2

Published on Jun 13
· Submitted by akhaliq on Jun 14
#1 Paper of the day
Abstract

This work presents Depth Anything V2. Without pursuing fancy techniques, we aim to reveal crucial findings to pave the way towards building a powerful monocular depth estimation model. Notably, compared with V1, this version produces much finer and more robust depth predictions through three key practices: 1) replacing all labeled real images with synthetic images, 2) scaling up the capacity of our teacher model, and 3) teaching student models via the bridge of large-scale pseudo-labeled real images. Compared with the latest models built on Stable Diffusion, our models are significantly more efficient (more than 10x faster) and more accurate. We offer models of different scales (ranging from 25M to 1.3B params) to support extensive scenarios. Benefiting from their strong generalization capability, we fine-tune them with metric depth labels to obtain our metric depth models. In addition to our models, considering the limited diversity and frequent noise in current test sets, we construct a versatile evaluation benchmark with precise annotations and diverse scenes to facilitate future research.
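For readers who want to try the released relative-depth models, below is a minimal inference sketch using the Hugging Face depth-estimation pipeline. The checkpoint id is an assumption; check the models linked to this paper for the exact names.

```python
# Minimal relative-depth inference sketch via the transformers "depth-estimation" pipeline.
# The model id below is an assumption; see the models linked to this paper for exact names.
from transformers import pipeline
from PIL import Image

pipe = pipeline(
    task="depth-estimation",
    model="depth-anything/Depth-Anything-V2-Small-hf",  # assumed checkpoint id
)

image = Image.open("example.jpg")      # any RGB image
result = pipe(image)
depth_map = result["depth"]            # PIL image with per-pixel relative depth
depth_map.save("example_depth.png")
```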

Community

Paper submitter

[Screenshot: models, datasets, and Spaces linked to the paper]

Super cool to see all the related models, datasets, and Spaces linked to the paper! 🔥

Why did it get removed from GitHub?

Paper author

Hi, our DepthAnything organization has been flagged by GitHub, as has our homepage account. They are hidden from the public for now. We are still appealing to recover them. Sorry for the inconvenience.

Thanks so much for all your efforts; the new Depth Anything V2 is really good. I agree with another user: could we get a metric Depth Anything V2 Small? Combined with the new Small model, it would give a faster option than the current ZoeDepth-style metric models.

Paper author

Hi, our smaller metric depth models have been released: https://github.com/DepthAnything/Depth-Anything-V2/tree/main/metric_depth
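For anyone trying the metric-depth checkpoints from the linked folder, here is a rough usage sketch. The class name, constructor arguments, checkpoint filename, and infer_image() call are assumptions drawn from the repository layout; consult its README for the exact API.

```python
# Sketch of metric-depth inference with the linked repository (assumed API).
import cv2
import torch
from depth_anything_v2.dpt import DepthAnythingV2  # from the repo's metric_depth folder (assumed path)

# Assumed config for the small (ViT-S) encoder; max_depth ~20 m for indoor checkpoints.
model = DepthAnythingV2(encoder="vits", features=64, out_channels=[48, 96, 192, 384],
                        max_depth=20)
model.load_state_dict(torch.load("checkpoints/depth_anything_v2_metric_hypersim_vits.pth",  # assumed filename
                                 map_location="cpu"))
model.eval()

img = cv2.imread("example.jpg")    # BGR image, as read by OpenCV
depth = model.infer_image(img)     # HxW numpy array of metric depth in meters
print(depth.min(), depth.max())
```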

Paper author

Hi @kaelsonofkrypto and @loawizard, thank you for your interest. We will train two smaller metric depth models based on Depth-Anything-Small and Depth-Anything-Base, respectively. Hopefully, we will release them within five days. Please stay tuned!


Models citing this paper 21


Datasets citing this paper 1

Spaces citing this paper 15

Collections including this paper 32