1920x1080 is HDTV resolution; I was working with a FireWire camera that output images of nearly this resolution. It was not real-time processing, but it was online processing (grab a frame, process it for calibration, and display it in an Augmented Reality system).
So changing the size for your system should be fine, but it probably won't be accepted as a patch into Player. This has been a long-standing problem: Player has the memory footprint of its worst-case user. It will be fixed soon by my work to use Geoff's dynamic memory work for all Player interfaces. This is still a work in progress, but progress is good, and a patch should be ready soon.
As for the *2 memory allocation, it's an efficiency move. It's not allocating twice as much as it needs; it's allocating twice as much as it previously had, so that it doesn't need to reallocate and move the write buffer each time it fills up. A range of different growth algorithms could be used here; doubling is just a simple, effective one, but perhaps not optimal in your low-memory environment.
I've mentioned this topic in one of my previous posts. I'm running Player
on a tiny embedded mainboard with only 32 MB of RAM. There's currently a USB
webcam that (finally, thanks to recent patches) works fine with CVS
Player. Unfortunately, when I try to receive image data, the Player server
fails with an out-of-memory assertion:
player: playertcp.cc:618: int PlayerTCP::WriteClient(int): Assertion
I've put two PLAYER_WARNs into the playertcp.cc code, and this is what I see:
warning : maxsize 33177752 > client->writebuffersize 65536
warning : allocating 33177752 bytes
player: playertcp.cc:624: int PlayerTCP::WriteClient(int): Assertion
So it's trying to allocate more memory than the system can provide
(remember that the kernel and the camera modules need to live somewhere).
My first trick so far was to change two definitions in the
player_interfaces.h header file (libplayercore):
/** Maximum image width, in pixels */
-#define PLAYER_CAMERA_IMAGE_WIDTH 1920
+#define PLAYER_CAMERA_IMAGE_WIDTH 640
/** Maximum image height, in pixels */
-#define PLAYER_CAMERA_IMAGE_HEIGHT 1080
+#define PLAYER_CAMERA_IMAGE_HEIGHT 480
/** Maximum image size, in pixels */
After applying these changes, the warnings I put into the code say:
warning : maxsize 4915352 > client->writebuffersize 65536
warning : allocating 4915352 bytes
...and of course, Player no longer crashes on the assertion.
Fortunately, it works (with no changes to client-side code), but is it
only a matter of good luck?
My doubts about this are:
1. Who really needs 1920x1080 resolution for real-time processing?! On
Player 1.6.x it was 640x480 for years.
2. Is my change to a global definition safe for the whole Player
infrastructure? (Remember that a client-side user doesn't really need to
know that I've made this change! Does he really need to make a similar
change to his Player library code before compiling any client-side software?)
3. Why isn't this a configure option?
My other doubt is about something I found in the playertcp.cc code:
// Get at least twice as much space
client->writebuffersize = MAX((size_t)(client->writebuffersize * 2), maxsize);
Why do we need at least twice as much space? Of course, even half of the
memory needed here before changing the max resolution to 640x480 (about
16-17 MB) is still too much (my embedded board has only about 15 MB of
free RAM left for Player and all the buffers it allocates), but I'm
still curious what the reason for this overhead is.
Playerstage-developers mailing list