2020-08-13 00:26:06 -0600 | received badge | ● Famous Question (source) |
2017-02-20 04:50:14 -0600 | received badge | ● Notable Question (source) |
2016-05-20 10:46:13 -0600 | received badge | ● Popular Question (source) |
2014-09-23 03:06:01 -0600 | commented question | How to Deploy and Distribute application Dependency Walker may not find all the dependencies. You can make Dependency Walker find more of them by profiling your app in it, but there is still no guarantee that you will find all of them. |
2014-08-31 23:42:48 -0600 | received badge | ● Student (source) |
2014-08-31 03:49:31 -0600 | answered a question | cvGetMinMaxHistValue in C++ Thanks to berak |
2014-08-31 03:44:56 -0600 | commented question | Representing a 2-D array of doubles by putting it in a Mat and displaying it. Yes, I remember. Thanks. But I initialize the Mat in a loop explicitly using Mat::at<double>(). The actual question is about information loss when displaying a CV_64F Mat with imshow(). |
2014-08-31 03:38:29 -0600 | answered a question | Why do I need both static and dynamic libraries to build OpenCV application? These static libraries are actually import libraries for the dynamic libraries. They let you load the DLLs implicitly, without calling LoadLibrary(), GetProcAddress() and so on: just declare the functions exported from the DLL in a header file and call them as usual. |
2014-08-31 03:26:13 -0600 | asked a question | Representing a 2-D array of doubles by putting it in a Mat and displaying it. I've got a 2-D array of double values. I want to put them in a Mat and display it; in other words, I want to represent my 2-D array of doubles in color. Should I scale these double values somehow before I put them in the Mat? I can't manage to represent all the values vividly enough to see the differences between them. |
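For display, imshow() treats floating-point images as if the values lie in [0, 1], so arbitrary doubles generally need min-max scaling first (in OpenCV terms, cv::normalize with NORM_MINMAX followed by convertTo(CV_8U)). A minimal sketch of that scaling in plain C++, with no OpenCV dependency; scaleTo8U is a hypothetical helper name:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Linearly map arbitrary doubles onto 0..255, the range an 8-bit
// display image uses; this mirrors what cv::normalize(src, dst, 0,
// 255, cv::NORM_MINMAX) + convertTo(CV_8U) would do.
std::vector<std::uint8_t> scaleTo8U(const std::vector<double>& src) {
    auto mm = std::minmax_element(src.begin(), src.end());
    double mn = *mm.first, mx = *mm.second;
    double range = (mx > mn) ? (mx - mn) : 1.0;  // avoid divide-by-zero on flat input
    std::vector<std::uint8_t> dst;
    dst.reserve(src.size());
    for (double v : src)
        dst.push_back(static_cast<std::uint8_t>((v - mn) / range * 255.0 + 0.5));
    return dst;
}
```

With the values spread over the full 0..255 range, small differences become visible; applying a color map afterwards (e.g. applyColorMap) gives the "in color" representation the question asks about.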
2014-08-16 00:34:50 -0600 | received badge | ● Citizen Patrol (source) |
2014-08-14 02:31:30 -0600 | asked a question | Why doesn't imshow always show pixel values in a window after zoom? In particular, the GUI doesn't show pixel values after zooming for images of type CV_16UC1, but it does for CV_8UC1. Why is that? |
2014-08-13 05:08:59 -0600 | asked a question | vector <vector <Point2f> > to Mat? I've got a vector of vectors of Point2f's. I need to:
I'm confused and stuck with cv::Mat and std::vector. What is the best way to store and process my data...? Could you please suggest an easy and simple way to perform the actions listed above? Thanks in advance. |
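One general answer to the title question: a vector<vector<Point2f>> has no single contiguous buffer, so a common first step is to flatten it into one vector, whose data can then back a Mat (for instance cv::Mat(n, 1, CV_32FC2, data).clone()). A sketch in plain C++ with a stand-in Point2f struct so it compiles without OpenCV; flatten is a hypothetical helper name:

```cpp
#include <vector>

// Stand-in for cv::Point2f so the sketch is self-contained.
struct Point2f { float x, y; };

// Concatenate the inner vectors into one contiguous buffer: the
// row-major layout a single-column two-channel float Mat would use.
std::vector<Point2f> flatten(const std::vector<std::vector<Point2f>>& vv) {
    std::vector<Point2f> out;
    for (const auto& v : vv)
        out.insert(out.end(), v.begin(), v.end());
    return out;
}
```

The clone() in the Mat constructor call above matters: without it the Mat only wraps the vector's memory and dangles once the vector is destroyed.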
2014-08-11 11:27:58 -0600 | received badge | ● Scholar (source) |
2014-08-11 09:41:58 -0600 | commented answer | Contours sorting. But std::stable_sort() instead of std::sort() works fine! |
2014-08-11 09:00:34 -0600 | commented answer | Contours sorting. Sorry, but this is not my case, not my problem. My problem is a segmentation fault while stepping through the compare function. |
2014-08-11 07:34:24 -0600 | commented answer | Contours sorting. I've got vector<Point2f> instead of vector<Point>. Does it really matter...? And this kind of code doesn't work with MinGW for me. :( It causes a segmentation fault. |
2014-08-11 03:52:03 -0600 | asked a question | Contours sorting. Are the contours returned by findContours() sorted somehow? It seems that they aren't. In fact, I've got a 1-D std::vector of centroids of contours (obtained with the help of moments()). I need to sort these centroids and "reduce" this 1-D vector into a 2-D vector (a vector of vectors) so that I can easily navigate among the centroids in a two-dimensional domain by i and j indexes. I should also insert so-called "missed" centroids (e.g. as negative ones) to make the 2-D array square or rectangular (alignment). The centroids form a fine grid, so I can measure the distances between them along the x and y coordinates to find out whether there is a "missed" centroid, or two, or several, between two existing ones. Could you suggest an easy, simple way to perform this? |
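A sketch of the grouping described above, in plain C++ with a stand-in point type (toGrid and rowTol are hypothetical names): sort by y, split into rows wherever the y-gap exceeds a tolerance, then sort each row by x. As an aside on the segfault mentioned in the comments: std::sort crashing inside the compare function is usually a comparator that violates strict weak ordering (e.g. returning true for equal elements), which is worth checking in addition to the stable_sort workaround.

```cpp
#include <algorithm>
#include <vector>

struct Pt { float x, y; };  // stand-in for cv::Point2f

// Reduce a flat list of grid centroids to a 2-D row/column layout.
// rowTol is an assumed parameter: roughly half the grid pitch.
std::vector<std::vector<Pt>> toGrid(std::vector<Pt> pts, float rowTol) {
    // Order by y first; stable_sort keeps equal-y points in input order.
    std::stable_sort(pts.begin(), pts.end(),
                     [](const Pt& a, const Pt& b) { return a.y < b.y; });
    std::vector<std::vector<Pt>> grid;
    for (const Pt& p : pts) {
        // Start a new row when the y-gap to the previous point is large.
        if (grid.empty() || p.y - grid.back().back().y > rowTol)
            grid.push_back({});
        grid.back().push_back(p);
    }
    // Within each row, order by x so (i, j) indexing works.
    for (auto& row : grid)
        std::stable_sort(row.begin(), row.end(),
                         [](const Pt& a, const Pt& b) { return a.x < b.x; });
    return grid;
}
```

Inserting the "missed" negative centroids would then be a second pass over each row, comparing consecutive x-distances against the grid pitch.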
2014-08-10 06:56:11 -0600 | commented question | Why do I need both static and dynamic libraries to build OpenCV application? These static libraries are actually import libraries. I've confused them. |
2014-08-08 21:41:00 -0600 | commented answer | how to delete repeating coordinates of vector<Point2f> try std::unordered_set |
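std::unordered_set needs an equality operator and a hash for its key type, and neither exists out of the box for point types, so the suggestion above requires a little glue. A sketch in plain C++ (Pt, PtHash, and dedup are hypothetical names) that removes exact duplicates while preserving first-seen order:

```cpp
#include <cstddef>
#include <functional>
#include <unordered_set>
#include <vector>

struct Pt { float x, y; };  // stand-in for cv::Point2f

bool operator==(const Pt& a, const Pt& b) { return a.x == b.x && a.y == b.y; }

// Hash combining both coordinates, required by std::unordered_set
// because there is no std::hash specialization for point types.
struct PtHash {
    std::size_t operator()(const Pt& p) const {
        std::size_t h = std::hash<float>{}(p.x);
        return h ^ (std::hash<float>{}(p.y) + 0x9e3779b9 + (h << 6) + (h >> 2));
    }
};

// Keep each coordinate pair the first time it appears, drop repeats.
std::vector<Pt> dedup(const std::vector<Pt>& pts) {
    std::unordered_set<Pt, PtHash> seen;
    std::vector<Pt> out;
    for (const Pt& p : pts)
        if (seen.insert(p).second)  // insert() reports whether p was new
            out.push_back(p);
    return out;
}
```

Note this compares floats exactly; coordinates that are merely close (e.g. from repeated detections) would need rounding or a tolerance-based pass instead.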
2014-08-07 08:53:09 -0600 | commented question | Why do I need both static and dynamic libraries to build OpenCV application? There is no problem. The question is not about how to fix something, but about why it is so. This question might be about the essentials of using libraries in programming and might not relate to OpenCV itself; it is general, perhaps. The fact is that this is my first experience using dynamic libraries explicitly. Before this I thought that if I have dynamic libraries, then I don't need static ones. But it seems that they are required. Why? |
2014-08-06 23:04:53 -0600 | asked a question | Why do I need both static and dynamic libraries to build OpenCV application? I've installed OpenCV from sources with MinGW on 64-bit MS Windows 7. I chose to build dynamic libraries. But when I build an OpenCV application, I need not only the dynamic libraries to be available for loading at runtime, I also need to link some static libraries while building the binaries, otherwise I get error messages from the linker about unresolved external symbols - you know what I mean. So why do I need both static and dynamic libraries to build an OpenCV application? |
2014-08-06 03:05:45 -0600 | asked a question | cvGetMinMaxHistValue in C++ How can I get maximum histogram value using C++ API? |
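In the C++ API the counterpart of cvGetMinMaxHistValue() is cv::minMaxLoc(hist, &minVal, &maxVal, &minLoc, &maxLoc), called on the histogram Mat produced by calcHist(). What it computes for a 1-D histogram, sketched over a plain vector of bin counts so it runs without OpenCV (HistExtrema and histMinMax are hypothetical names):

```cpp
#include <algorithm>
#include <vector>

// Result shape mirroring what cv::minMaxLoc reports for a 1-D histogram:
// smallest and largest bin counts plus the index of each.
struct HistExtrema { float minVal, maxVal; int minIdx, maxIdx; };

HistExtrema histMinMax(const std::vector<float>& bins) {
    auto mn = std::min_element(bins.begin(), bins.end());
    auto mx = std::max_element(bins.begin(), bins.end());
    return { *mn, *mx,
             static_cast<int>(mn - bins.begin()),
             static_cast<int>(mx - bins.begin()) };
}
```

The maximum value is typically what you want for normalizing the histogram before drawing it.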
2014-08-02 09:45:49 -0600 | commented answer | Detecting spreaded white spots of light on a noisy gray-scale image. Thank you for your answer, but... Nothing real is perfect. The images I need to process come from a real wavefront sensor. |
2014-08-02 08:07:05 -0600 | commented answer | Detecting spreaded white spots of light on a noisy gray-scale image. That sounds right. Thanks. I also tried using a bilateral filter first of all, and it turned out to be a really useful exercise. |
2014-08-02 04:54:22 -0600 | received badge | ● Supporter (source) |