

maybe, maybe not.
when h264 was introduced (Aug 2004), even intel had HW encoding for it with sandy bridge in 2011, and nvidia had it in 2012.
so less than 7 years.
av1 was first introduced 7 years ago, and for at least two years now android TVs have been required to support HW decoding for it.
and AMD's rdna2 had AV1 decoding 4 years ago.
so from introduction to hardware decoding it took about 3 years.
I have no idea why 10 years is thrown around.
and av1 had to compete with both h264 and h265 (vendors had to decide whether it was worth implementing at all).
you didn’t do the wrong thing.
what many people don't notice is that hardware support for a codec in a gpu has two parts: decoding and encoding.
for quality video nobody does hardware encoding (at least not on consumer systems, like this nvidia 3050 on linux), since software encoders give better quality at the same bitrate.
for most users the important thing is hardware decoding support, so they can watch their 4k movies with no issues.
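on linux you can actually see this decode/encode split in the output of `vainfo` (VA-API): decode support shows up as `VAEntrypointVLD` and encode as `VAEntrypointEncSlice`. a minimal sketch of parsing it — the `sample_output` string here is a made-up example, in practice you'd run `vainfo` and parse its real output:

```python
# Hypothetical vainfo output; a real run may list more or fewer profiles.
sample_output = """\
vainfo: Supported profile and entrypoints
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileAV1Profile0            : VAEntrypointVLD
"""

def decode_profiles(vainfo_text):
    # VAEntrypointVLD marks hardware *decode*; VAEntrypointEncSlice marks encode.
    profiles = set()
    for line in vainfo_text.splitlines():
        if "VAEntrypointVLD" in line:
            profiles.add(line.split(":")[0].strip())
    return profiles

supported = decode_profiles(sample_output)
print("av1 hw decode" if any("AV1" in p for p in supported) else "no av1 hw decode")
```

note how the sample gpu decodes av1 but only encodes h264 — exactly the asymmetry described above.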
so you are in the clear.
you can watch av1 right now, and av2 won't become popular enough to matter for at least another 4 years.