OpenCV persistent object tracking and hysteresis strategy

https://www.devze.com 2023-04-11 13:37 (source: web)
I'm building an object tracking API for my team.

My code will recognize foreground objects in the camera scene. Over time, it will call addObject(id, pos), updateObject(id, newPos), and removeObject(id) on instances that implement my listener interface. These events fire after each frame is processed, so they might occur 30 times a second.
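For concreteness, the listener interface described above might look like the following sketch. The class and method names beyond addObject/updateObject/removeObject are assumptions for illustration:

```python
from abc import ABC, abstractmethod
from typing import List, Tuple

Pos = Tuple[int, int]  # assumed: pixel coordinates (x, y)

class ObjectListener(ABC):
    """Hypothetical sketch of the listener interface from the question."""

    @abstractmethod
    def addObject(self, id: int, pos: Pos) -> None: ...

    @abstractmethod
    def updateObject(self, id: int, newPos: Pos) -> None: ...

    @abstractmethod
    def removeObject(self, id: int) -> None: ...

class LoggingListener(ObjectListener):
    """Minimal implementation that just records the event stream."""

    def __init__(self) -> None:
        self.events: List[tuple] = []

    def addObject(self, id: int, pos: Pos) -> None:
        self.events.append(("add", id, pos))

    def updateObject(self, id: int, newPos: Pos) -> None:
        self.events.append(("update", id, newPos))

    def removeObject(self, id: int) -> None:
        self.events.append(("remove", id))
```

Recording events this way is also a convenient harness for measuring flicker: a run of add/remove pairs for the same region within a few frames is exactly the behavior to suppress.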

How can I make sure that objects don't flicker in and out of existence? I need to give objects a minimum lifetime. If an object disappears for one frame and reappears in the same spot in the next frame with a new ID, that is also undesired flickering.
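One way to handle the "reappears in the same spot with a new ID" case is to keep a short history of recently removed objects and revive a matching ID instead of minting a fresh one. The helper below is a hypothetical sketch of that idea; the function name, distance threshold, and age limit are all illustrative assumptions:

```python
import math
from typing import Dict, Optional, Tuple

Pos = Tuple[float, float]
# recently_removed maps object ID -> (last known position, frames since removal)
RemovedMap = Dict[int, Tuple[Pos, int]]

def reuse_recent_id(pos: Pos, recently_removed: RemovedMap,
                    max_dist: float = 20.0, max_age_frames: int = 3) -> Optional[int]:
    """If a detection appears near where an object vanished within the
    last few frames, return that object's ID so it can be revived;
    otherwise return None and the caller assigns a fresh ID."""
    best: Optional[Tuple[int, float]] = None
    for obj_id, (old_pos, age) in recently_removed.items():
        if age > max_age_frames:
            continue  # removed too long ago to count as flicker
        d = math.dist(pos, old_pos)
        if d <= max_dist and (best is None or d < best[1]):
            best = (obj_id, d)
    return best[0] if best is not None else None
```

Combined with a minimum lifetime, this covers both flicker modes: objects that vanish too quickly, and objects that come back under a new identity.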


(My thoughts so far) I have considered using an object mask accumulator as the basis for instantiation. Picture a grayscale image in which candidate object regions are intensified each frame; as soon as a region's accumulated value exceeds a threshold, the object is instantiated and addObject(id, pos) is called. The problem with this is that a region can hover around the threshold and still exhibit flickery behavior. So I would add a constant value to the object region as soon as it is instantiated, giving it a minimum lifetime in the accumulator. That constant would be subtracted again when the region crosses back below the accumulator threshold.
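The accumulate/boost scheme described above can be sketched per pixel as follows. This is a minimal illustration with made-up constants, not tuned values, and it deliberately ignores region labeling and ID assignment:

```python
import numpy as np

class MaskAccumulator:
    """Sketch of the hysteresis accumulator from the question: evidence
    builds where the foreground mask is set and decays elsewhere; a pixel
    becomes "alive" only after crossing THRESH, and then receives a
    one-time BOOST so it cannot immediately flicker back out."""

    THRESH = 10.0  # evidence needed before a pixel counts as an object
    BOOST = 5.0    # lifetime bonus added on instantiation
    GAIN = 3.0     # evidence added per frame under the foreground mask
    DECAY = 1.0    # evidence removed per frame elsewhere

    def __init__(self, shape):
        self.acc = np.zeros(shape, np.float32)
        self.alive = np.zeros(shape, bool)

    def update(self, fg_mask):
        # Accumulate evidence, clipped so lifetimes stay bounded.
        self.acc += np.where(fg_mask, self.GAIN, -self.DECAY)
        np.clip(self.acc, 0.0, self.THRESH + self.BOOST, out=self.acc)

        # Rising cross: instantiate and grant the lifetime boost.
        born = (~self.alive) & (self.acc >= self.THRESH)
        self.acc[born] += self.BOOST
        self.alive[born] = True

        # Falling cross: take the boost back and remove the object.
        died = self.alive & (self.acc < self.THRESH)
        self.acc[died] = np.maximum(self.acc[died] - self.BOOST, 0.0)
        self.alive[died] = False
        return self.alive
```

With these constants, a pixel must be foreground for several consecutive frames before instantiation, and once alive it survives BOOST/DECAY extra frames of absence before removal, which is the minimum lifetime the question asks for.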


I ended up using the solution described in my question, almost without any further tweaks, and it worked well for what I was doing. One thing to note: you must keep a copy of the previous accumulator state to determine whether pixel values are rising or falling across the threshold.
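The rising/falling detection mentioned in the answer can be done by comparing the saved previous accumulator against the current one. A small hypothetical helper (the name and default threshold are assumptions):

```python
import numpy as np

def threshold_crossings(prev, curr, thresh=10.0):
    """Compare the previous and current accumulator states and return
    per-pixel boolean masks for rising and falling threshold crossings.
    Rising edges are where objects get instantiated; falling edges are
    where they get removed."""
    prev = np.asarray(prev)
    curr = np.asarray(curr)
    rising = (prev < thresh) & (curr >= thresh)
    falling = (prev >= thresh) & (curr < thresh)
    return rising, falling
```

The copy must be taken before the current frame's gain/decay is applied; otherwise both masks compare a state against itself and no crossings are ever detected.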