Need to get pixel averages of a vector sitting on a bitmap...
Posted by user346511 on Stack Overflow, 2010-05-21
Tags: image-manipulation | python
I'm currently involved in a hardware project where I am mapping triangular-shaped LEDs to traditional bitmap images. I'd like to overlay a triangle vector onto an image and get the average pixel data within the bounds of that vector. However, I'm unfamiliar with the math needed to calculate this. Does anyone have an algorithm or a link that could point me in the right direction? I'm not even clear what this type of math is called.
I've created a basic image of what I'm trying to capture here: http://imgur.com/Isjip.gif
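In case it helps clarify, here's roughly what I have in mind in Python, using PIL and numpy: rasterize the triangle into a mask and average the pixels under it. I'm not sure this is the right approach, and the file name and triangle coordinates are just placeholders.

    import numpy as np
    from PIL import Image, ImageDraw

    def average_inside_triangle(image_path, triangle):
        """Average the RGB values of all pixels inside `triangle`.

        triangle: list of three (x, y) vertex tuples in pixel coordinates.
        """
        img = Image.open(image_path).convert("RGB")

        # Rasterize the triangle into a 1-bit mask the same size as the image.
        mask = Image.new("1", img.size, 0)
        ImageDraw.Draw(mask).polygon(triangle, fill=1)

        pixels = np.asarray(img, dtype=np.float64)   # shape (H, W, 3)
        inside = np.asarray(mask, dtype=bool)        # shape (H, W)

        # Mean over every pixel whose centre falls inside the triangle.
        return pixels[inside].mean(axis=0)

    # Example usage with made-up coordinates:
    # avg_rgb = average_inside_triangle("leds.png", [(10, 10), (60, 15), (30, 70)])

Is this a reasonable way to do it, or is there a more standard technique (and a name for it)?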