From: Siddharth N. <sn...@dr...> - 2014-06-13 23:43:15
Philippe,

Thanks very much for that. I ran with what you said on my single-threaded benchmark and got a line like the following just before the tool died.

--------
Arena "tool": 31444697088/31444697088 max/curr mmap'd, 0/0 unsplit/split sb unmmap'd, 19661485344/19661485344 max/curr on_loan
--------
<Here there is a breakdown of the various mallocs in the code>

This is a single-threaded benchmark, which creates just one "tool" arena. This indicates that the arena's size is close to 29.3 GB, while the actual usable bytes (on_loan) are 18.3 GB. My calculation of malloc'd bytes also yields 18.3 GB, so that part is correct. Why is the arena size so much larger? I currently do not free any memory, so is the data segment growing in discrete and huge chunks on certain malloc calls?

On 13 June 2014 13:30, Philippe Waroquiers <phi...@sk...> wrote:
> On Thu, 2014-06-12 at 21:21 -0400, Siddharth Nilakantan wrote:
> > Hi All,
> >
> > I did a modification of the Callgrind tool to measure the amount of
> > bytes communicated between functions. To do this, I used a shadow
> > memory similar to what was described in the "How to shadow every byte"
> > paper. As my secondary maps are quite huge (sometimes up to 2.5 MB),
> > the memory usage of the tool becomes sky high, especially for
> > memory-intensive applications. I am keeping track of all my VG_mallocs
> > and I periodically calculate the heap usage.
> >
> > However, I noticed that for some memory-intensive benchmarks, the
> > VIRT reported by Linux is much larger than my estimate of memory. For
> > example, in one scenario, I report 18 GB of allocated memory, while the
> > VIRT reported by "top" shows 32 GB. The tool stops running on the next
> > memory allocation request and Valgrind says that it cannot allocate
> > any more memory. I checked my calculation of heap usage several times
> > and I believe it's correct.
> >
> > I tried running the tool under Memcheck itself to see if there are any
> > leaks, but Memcheck does not seem to report anything wrong. In fact,
> > judging by the numbers I see, I'm not sure Memcheck is seeing a lot of
> > the allocations requested by my tool.
> >
> > Any theories on why this may be happening? Is there a way I can
> > periodically check if the number of new virtual pages requested during
> > a malloc is greater than the actual malloc'ed memory?
> While the tool is running, you could launch from the shell at regular
> intervals:
>   vgdb v.info memory aspacemgr
>
> This will report the size of each Valgrind arena, and (in the Valgrind
> process output file) gives the detail of the blocks mmap'd by the
> Valgrind address space manager.
>
> You might also try running with --profile-heap=yes: this gives a lot
> more detail about the content of the Valgrind arenas, but that costs
> some overhead per allocated block.
>
> Philippe
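The gap reported above (mmap'd bytes well above on_loan bytes, even with no frees) is the kind of behaviour an allocator with coarse growth granularity can produce on its own. The toy model below is only an illustrative sketch, not Valgrind's actual arena code; the 4 MB superblock size and all of the names (ToyArena, sample sizes) are invented for the example. It shows how an arena that grows in fixed-size superblocks, and cannot reuse the unusable tail of a superblock, ends up with far more memory mapped from the OS than is actually on loan to clients.

```python
# Illustrative toy model (NOT Valgrind's real allocator): an arena that
# grows in large fixed-size superblocks obtained from the OS and hands
# out blocks from the current superblock.  "mmapd" counts whole
# superblocks; "on_loan" counts only bytes handed to the client, so
# mmapd can run well ahead of on_loan even though nothing is freed.
SUPERBLOCK = 4 * 1024 * 1024  # hypothetical 4 MB growth granularity

class ToyArena:
    def __init__(self):
        self.mmapd = 0       # bytes obtained from the OS
        self.on_loan = 0     # bytes currently handed out
        self.free_in_sb = 0  # unused tail of the current superblock

    def malloc(self, nbytes):
        if nbytes > self.free_in_sb:
            # The request does not fit in the current superblock: map a
            # new one (rounded up to the granularity) and abandon the
            # old tail, which stays mmap'd but is never on loan.
            sbs = -(-nbytes // SUPERBLOCK)  # ceiling division
            self.mmapd += sbs * SUPERBLOCK
            self.free_in_sb = sbs * SUPERBLOCK
        self.free_in_sb -= nbytes
        self.on_loan += nbytes

arena = ToyArena()
for _ in range(10000):
    arena.malloc(3 * 1024 * 1024)  # 3 MB blocks never fit twice in 4 MB
print(arena.mmapd, arena.on_loan, arena.mmapd / arena.on_loan)
# → 41943040000 31457280000 1.3333333333333333
```

With 3 MB requests from 4 MB superblocks the arena maps 4/3 of what it lends out; real fragmentation patterns (mixed sizes, per-block admin headers) can widen or narrow that ratio, which is why the --profile-heap output mentioned in this thread is the right tool for the real numbers.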
From: Philippe W. <phi...@sk...> - 2014-06-13 17:30:37
On Thu, 2014-06-12 at 21:21 -0400, Siddharth Nilakantan wrote:
> Hi All,
>
> I did a modification of the Callgrind tool to measure the amount of
> bytes communicated between functions. To do this, I used a shadow
> memory similar to what was described in the "How to shadow every byte"
> paper. As my secondary maps are quite huge (sometimes up to 2.5 MB),
> the memory usage of the tool becomes sky high, especially for
> memory-intensive applications. I am keeping track of all my VG_mallocs
> and I periodically calculate the heap usage.
>
> However, I noticed that for some memory-intensive benchmarks, the
> VIRT reported by Linux is much larger than my estimate of memory. For
> example, in one scenario, I report 18 GB of allocated memory, while the
> VIRT reported by "top" shows 32 GB. The tool stops running on the next
> memory allocation request and Valgrind says that it cannot allocate
> any more memory. I checked my calculation of heap usage several times
> and I believe it's correct.
>
> I tried running the tool under Memcheck itself to see if there are any
> leaks, but Memcheck does not seem to report anything wrong. In fact,
> judging by the numbers I see, I'm not sure Memcheck is seeing a lot of
> the allocations requested by my tool.
>
> Any theories on why this may be happening? Is there a way I can
> periodically check if the number of new virtual pages requested during
> a malloc is greater than the actual malloc'ed memory?

While the tool is running, you could launch from the shell at regular intervals:

  vgdb v.info memory aspacemgr

This will report the size of each Valgrind arena, and (in the Valgrind process output file) gives the detail of the blocks mmap'd by the Valgrind address space manager.

You might also try running with --profile-heap=yes: this gives a lot more detail about the content of the Valgrind arenas, but that costs some overhead per allocated block.

Philippe
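The monitoring Philippe suggests is easy to script. The sketch below is a generic Linux loop, not part of Valgrind: it samples VmSize (the figure "top" shows as VIRT) from /proc/<pid>/status so the tool's own heap accounting can be compared against the kernel's view over time. The sample_vmsize helper name is invented for this example, and the vgdb line (taken verbatim from the mail above) is left commented out because it needs a live process running under Valgrind with its embedded gdbserver enabled.

```shell
#!/bin/sh
# Sketch: periodically print a process's total virtual size by reading
# /proc/<pid>/status.  Works for any PID on Linux; defaults to the
# current shell so the script runs standalone for demonstration.

sample_vmsize() {
    # Print the VmSize line (total mapped virtual memory) for PID $1.
    grep '^VmSize:' "/proc/$1/status"
}

pid="${1:-$$}"
i=0
while [ "$i" -lt 3 ]; do
    date '+%T'
    sample_vmsize "$pid"
    # For a process running under Valgrind, also dump the arena and
    # address-space-manager state (command from the mail above):
    # vgdb v.info memory aspacemgr
    sleep 1
    i=$((i + 1))
done
```

Logging both numbers side by side makes it obvious whether VIRT grows in large discrete jumps on particular allocation requests, which was exactly the question raised in this thread.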
From: Philippe W. <phi...@sk...> - 2014-06-13 04:55:18
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (GCC) 4.7.2 20121109 (Red Hat 4.7.2-8)
GDB: GNU gdb (GDB) Fedora (7.5.1-37.fc18)
Assembler: GNU assembler version 2.23.51.0.1-7.fc18 20120806
C library: GNU C Library stable release version 2.16
uname -mrs: Linux 3.8.8-202.fc18.ppc64p7 ppc64
Vendor version: Fedora release 18 (Spherical Cow)
Nightly build on gcc110 ( Fedora release 18 (Spherical Cow), ppc64 )
Started at 2014-06-12 20:00:14 PDT
Ended at 2014-06-12 21:51:54 PDT
Results unchanged from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 582 tests, 9 stderr failures, 3 stdout failures, 0 stderrB failures, 0 stdoutB failures, 2 post failures ==
memcheck/tests/linux/getregset (stdout)
memcheck/tests/linux/getregset (stderr)
memcheck/tests/ppc64/power_ISA2_05 (stdout)
memcheck/tests/supp_unknown (stderr)
memcheck/tests/varinfo6 (stderr)
memcheck/tests/wrap8 (stdout)
memcheck/tests/wrap8 (stderr)
massif/tests/big-alloc (post)
massif/tests/deep-D (post)
helgrind/tests/pth_cond_destroy_busy (stderr)
helgrind/tests/tc06_two_races_xml (stderr)
helgrind/tests/tc18_semabuse (stderr)
helgrind/tests/tc20_verifywrap (stderr)
drd/tests/boost_thread (stderr)
--tools=none,memcheck,callgrind,helgrind,cachegrind,drd,massif --reps=3 --vg=../valgrind-new --vg=../valgrind-old
-- Running tests in perf ----------------------------------------------
-- bigcode1 --
bigcode1 valgrind-new:0.26s no: 1.7s ( 6.6x, -----) me: 2.9s (11.1x, -----) ca:18.0s (69.1x, -----) he: 1.7s ( 6.7x, -----) ca: 5.3s (20.3x, -----) dr: 1.7s ( 6.5x, -----) ma: 2.1s ( 8.2x, -----)
bigcode1 valgrind-old:0.26s no: 1.6s ( 6.0x, 8.2%) me: 2.9s (11.0x, 0.7%) ca:17.9s (68.8x, 0.4%) he: 1.8s ( 6.8x, -1.7%) ca: 5.7s (21.9x, -7.6%) dr: 1.9s ( 7.2x,-10.7%) ma: 2.1s ( 8.2x, 0.5%)
-- bigcode2 --
bigcode2 valgrind-new:0.26s no: 1.7s ( 6.7x, -----) me: 3.0s (11.6x, -----) ca:18.5s (71.1x, -----) he: 2.4s ( 9.1x, -----) ca: 5.6s (21.4x, -----) dr: 1.8s ( 7.1x, -----) ma: 2.1s ( 8.2x, -----)
bigcode2 valgrind-old:0.26s no: 1.6s ( 6.0x, 10.3%) me: 3.2s (12.2x, -5.0%) ca:18.6s (71.6x, -0.6%) he: 2.1s ( 8.3x, 8.9%) ca: 6.3s (24.3x,-13.7%) dr: 2.0s ( 7.7x, -8.7%) ma: 2.3s ( 8.8x, -7.0%)
-- bz2 --
bz2 valgrind-new:0.75s no: 4.8s ( 6.4x, -----) me:11.8s (15.8x, -----) ca:26.3s (35.0x, -----) he:14.9s (19.8x, -----) ca:24.2s (32.3x, -----) dr:19.1s (25.5x, -----) ma: 4.8s ( 6.4x, -----)
bz2 valgrind-old:0.75s no: 4.5s ( 6.1x, 5.6%) me:11.6s (15.5x, 1.8%) ca:25.8s (34.4x, 1.8%) he:14.8s (19.7x, 0.7%) ca:24.9s (33.2x, -2.9%) dr:19.1s (25.5x, 0.1%) ma: 4.7s ( 6.2x, 2.1%)
-- fbench --
fbench valgrind-new:0.34s no: 2.1s ( 6.2x, -----) me: 5.2s (15.3x, -----) ca: 8.5s (24.9x, -----) he: 5.3s (15.6x, -----) ca: 7.4s (21.8x, -----) dr: 4.8s (14.2x, -----) ma: 2.1s ( 6.3x, -----)
fbench valgrind-old:0.34s no: 2.2s ( 6.6x, -7.1%) me: 5.2s (15.4x, -0.6%) ca: 8.5s (25.1x, -0.8%) he: 5.2s (15.4x, 1.5%) ca: 7.4s (21.9x, -0.3%) dr: 4.8s (14.1x, 1.0%) ma: 2.1s ( 6.3x, 0.0%)
-- ffbench --
ffbench valgrind-new:0.45s no: 1.3s ( 2.9x, -----) me: 2.4s ( 5.4x, -----) ca: 2.5s ( 5.5x, -----) he: 6.9s (15.3x, -----) ca: 7.0s (15.5x, -----) dr: 4.9s (10.9x, -----) ma: 1.0s ( 2.2x, -----)
ffbench valgrind-old:0.45s no: 1.3s ( 2.9x, 0.8%) me: 2.5s ( 5.5x, -1.2%) ca: 2.5s ( 5.6x, -0.8%) he: 6.9s (15.4x, -0.6%) ca: 7.1s (15.7x, -1.6%) dr: 4.9s (11.0x, -0.2%) ma: 1.0s ( 2.2x, 1.0%)
-- heap --
heap valgrind-new:0.41s no: 2.4s ( 5.9x, -----) me: 9.6s (23.3x, -----) ca:13.2s (32.2x, -----) he:11.9s (29.1x, -----) ca:12.1s (29.6x, -----) dr: 8.0s (19.5x, -----) ma: 8.8s (21.5x, -----)
heap valgrind-old:0.41s no: 2.4s ( 6.0x, -0.8%) me: 9.6s (23.3x, 0.0%) ca:13.1s (32.0x, 0.6%) he:11.8s (28.8x, 1.3%) ca:12.2s (29.8x, -0.7%) dr: 8.0s (19.4x, 0.4%) ma: 8.9s (21.7x, -0.8%)
-- heap_pdb4 --
heap_pdb4 valgrind-new:0.41s no: 2.9s ( 7.1x, -----) me:13.7s (33.3x, -----) ca:14.8s (36.1x, -----) he:13.5s (33.0x, -----) ca:13.2s (32.2x, -----) dr: 8.9s (21.7x, -----) ma: 8.7s (21.3x, -----)
heap_pdb4 valgrind-old:0.41s no: 2.7s ( 6.6x, 6.9%) me:13.6s (33.0x, 0.8%) ca:14.7s (35.8x, 0.7%) he:13.1s (32.0x, 3.2%) ca:13.1s (32.0x, 0.6%) dr: 9.0s (22.0x, -1.2%) ma: 8.8s (21.4x, -0.7%)
-- many-loss-records --
many-loss-records valgrind-new:0.03s no: 0.5s (17.7x, -----) me: 2.1s (71.0x, -----) ca: 1.9s (62.3x, -----) he: 1.8s (61.0x, -----) ca: 1.9s (62.3x, -----) dr: 1.6s (52.3x, -----) ma: 1.6s (52.3x, -----)
many-loss-records valgrind-old:0.03s no: 0.5s (17.7x, 0.0%) me: 2.1s (71.3x, -0.5%) ca: 1.9s (62.7x, -0.5%) he: 1.8s (61.0x, 0.0%) ca: 1.9s (61.7x, 1.1%) dr: 1.6s (52.7x, -0.6%) ma: 1.6s (52.3x, 0.0%)
-- many-xpts --
many-xpts valgrind-new:0.07s no: 0.7s (10.6x, -----) me: 3.3s (47.4x, -----) ca: 4.7s (66.7x, -----) he: 4.9s (69.6x, -----) ca: 2.9s (41.1x, -----) dr: 2.3s (33.3x, -----) ma: 2.3s (32.3x, -----)
many-xpts valgrind-old:0.07s no: 0.7s (10.6x, 0.0%) me: 3.3s (47.4x, 0.0%) ca: 4.7s (67.0x, -0.4%) he: 4.9s (69.4x, 0.2%) ca: 2.9s (41.4x, -0.7%) dr: 2.3s (33.3x, 0.0%) ma: 2.3s (32.3x, 0.0%)
-- sarp --
sarp valgrind-new:0.02s no: 0.4s (20.0x, -----) me: 3.0s (151.5x, -----) ca: 3.0s (148.0x, -----) he:11.1s (553.0x, -----) ca: 1.9s (95.0x, -----) dr: 1.1s (54.5x, -----) ma: 0.4s (21.0x, -----)
sarp valgrind-old:0.02s no: 0.4s (19.5x, 2.5%) me: 3.0s (152.0x, -0.3%) ca: 3.0s (149.0x, -0.7%) he:11.2s (559.5x, -1.2%) ca: 1.7s (83.0x, 12.6%) dr: 1.1s (55.0x, -0.9%) ma: 0.4s (20.5x, 2.4%)
-- tinycc --
tinycc valgrind-new:0.28s no: 3.0s (10.6x, -----) me:13.7s (48.9x, -----) ca:17.3s (61.6x, -----) he:18.9s (67.5x, -----) ca:15.6s (55.6x, -----) dr:12.0s (42.8x, -----) ma: 3.8s (13.6x, -----)
tinycc valgrind-old:0.28s no: 3.0s (10.6x, 0.3%) me:13.7s (48.9x, 0.0%) ca:17.3s (61.9x, -0.3%) he:18.9s (67.6x, -0.2%) ca:15.6s (55.6x, 0.1%) dr:12.0s (43.0x, -0.3%) ma: 3.8s (13.6x, 0.0%)
-- Finished tests in perf ----------------------------------------------
== 11 programs, 154 timings =================
real 53m27.893s
user 52m28.220s
sys 0m19.018s
From: Christian B. <bor...@de...> - 2014-06-13 04:13:28
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (SUSE Linux) 4.3.4 [gcc-4_3-branch revision 152973]
GDB: GNU gdb (GDB) SUSE (7.5.1-0.7.29)
Assembler: GNU assembler (GNU Binutils; SUSE Linux Enterprise 11) 2.23.1
C library: GNU C Library stable release version 2.11.3 (20110527)
uname -mrs: Linux 3.0.101-0.21-default s390x
Vendor version: Welcome to SUSE Linux Enterprise Server 11 SP3 (s390x) - Kernel %r (%t).
Nightly build on sless390 ( SUSE Linux Enterprise Server 11 SP3 gcc 4.3.4 on z196 (s390x) )
Started at 2014-06-13 03:45:01 CEST
Ended at 2014-06-13 06:13:16 CEST
Results unchanged from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 647 tests, 2 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
helgrind/tests/pth_cond_destroy_busy (stderr)
helgrind/tests/tc20_verifywrap (stderr)
--tools=none,memcheck,callgrind,helgrind,cachegrind,drd,massif --reps=3 --vg=../valgrind-new --vg=../valgrind-old
-- Running tests in perf ----------------------------------------------
-- bigcode1 --
bigcode1 valgrind-new:0.22s no: 4.4s (20.1x, -----) me: 6.9s (31.3x, -----) ca:26.4s (120.0x, -----) he: 5.1s (23.2x, -----) ca: 9.2s (41.6x, -----) dr: 5.4s (24.4x, -----) ma: 4.6s (20.8x, -----)
bigcode1 valgrind-old:0.22s no: 4.4s (19.8x, 1.6%) me: 6.9s (31.4x, -0.1%) ca:26.4s (119.9x, 0.1%) he: 5.1s (23.3x, -0.2%) ca: 9.1s (41.5x, 0.1%) dr: 5.4s (24.5x, -0.4%) ma: 4.6s (21.0x, -1.1%)
-- bigcode2 --
bigcode2 valgrind-new:0.24s no: 7.3s (30.5x, -----) me:13.7s (57.2x, -----) ca:39.5s (164.8x, -----) he:10.1s (41.9x, -----) ca:14.3s (59.5x, -----) dr: 9.5s (39.5x, -----) ma: 8.0s (33.5x, -----)
bigcode2 valgrind-old:0.24s no: 7.3s (30.5x, -0.1%) me:13.8s (57.5x, -0.4%) ca:39.5s (164.7x, 0.1%) he:10.1s (42.3x, -0.8%) ca:14.3s (59.4x, 0.1%) dr: 9.6s (39.8x, -1.0%) ma: 8.0s (33.2x, 1.0%)
-- bz2 --
bz2 valgrind-new:0.70s no: 5.1s ( 7.3x, -----) me:12.7s (18.2x, -----) ca:30.7s (43.8x, -----) he:19.6s (28.0x, -----) ca:34.3s (49.0x, -----) dr:29.0s (41.4x, -----) ma: 3.9s ( 5.6x, -----)
bz2 valgrind-old:0.70s no: 5.1s ( 7.3x, 0.0%) me:12.8s (18.3x, -0.5%) ca:30.7s (43.8x, 0.0%) he:19.6s (28.0x, 0.1%) ca:34.3s (49.0x, -0.0%) dr:29.0s (41.4x, -0.0%) ma: 4.0s ( 5.7x, -1.8%)
-- fbench --
fbench valgrind-new:0.41s no: 1.6s ( 3.9x, -----) me: 4.2s (10.2x, -----) ca: 9.3s (22.7x, -----) he: 6.2s (15.2x, -----) ca: 7.2s (17.6x, -----) dr: 5.5s (13.5x, -----) ma: 1.7s ( 4.1x, -----)
fbench valgrind-old:0.41s no: 1.6s ( 3.9x, 0.0%) me: 4.2s (10.3x, -0.2%) ca: 9.3s (22.6x, 0.2%) he: 6.2s (15.2x, 0.2%) ca: 7.2s (17.6x, -0.1%) dr: 5.5s (13.5x, 0.4%) ma: 1.7s ( 4.1x, -0.0%)
-- ffbench --
ffbench valgrind-new:0.20s no: 1.1s ( 5.3x, -----) me: 3.0s (15.0x, -----) ca: 3.0s (15.1x, -----) he:44.1s (220.4x, -----) ca: 9.6s (48.0x, -----) dr: 6.9s (34.6x, -----) ma: 1.0s ( 4.8x, -----)
ffbench valgrind-old:0.20s no: 1.1s ( 5.2x, 0.9%) me: 3.0s (15.0x, 0.0%) ca: 3.0s (15.1x, 0.0%) he:44.1s (220.4x, 0.0%) ca: 9.6s (48.0x, 0.0%) dr: 7.1s (35.4x, -2.3%) ma: 1.0s ( 4.8x, 1.0%)
-- heap --
heap valgrind-new:0.24s no: 1.9s ( 7.8x, -----) me: 8.6s (35.8x, -----) ca:13.2s (55.0x, -----) he:12.7s (52.8x, -----) ca:11.2s (46.8x, -----) dr: 7.7s (32.2x, -----) ma: 7.9s (33.0x, -----)
heap valgrind-old:0.24s no: 1.9s ( 7.8x, 0.0%) me: 8.7s (36.2x, -1.3%) ca:13.1s (54.6x, 0.7%) he:12.8s (53.2x, -0.8%) ca:11.2s (46.9x, -0.2%) dr: 8.3s (34.8x, -7.9%) ma: 7.9s (32.8x, 0.5%)
-- heap_pdb4 --
heap_pdb4 valgrind-new:0.22s no: 2.1s ( 9.7x, -----) me:12.7s (57.8x, -----) ca:14.2s (64.5x, -----) he:14.2s (64.4x, -----) ca:12.4s (56.2x, -----) dr: 8.5s (38.5x, -----) ma: 8.1s (36.6x, -----)
heap_pdb4 valgrind-old:0.22s no: 2.1s ( 9.4x, 3.3%) me:12.7s (57.9x, -0.2%) ca:14.2s (64.5x, -0.1%) he:14.2s (64.3x, 0.1%) ca:12.4s (56.2x, 0.0%) dr: 9.2s (42.0x, -9.1%) ma: 8.1s (36.6x, 0.0%)
-- many-loss-records --
many-loss-records valgrind-new:0.02s no: 0.5s (24.0x, -----) me: 2.1s (103.0x, -----) ca: 1.9s (97.0x, -----) he: 2.2s (108.5x, -----) ca: 1.9s (95.5x, -----) dr: 1.7s (86.5x, -----) ma: 1.7s (83.5x, -----)
many-loss-records valgrind-old:0.02s no: 0.5s (24.0x, 0.0%) me: 2.1s (103.0x, 0.0%) ca: 1.9s (96.5x, 0.5%) he: 2.2s (108.5x, 0.0%) ca: 1.9s (95.5x, 0.0%) dr: 1.8s (89.5x, -3.5%) ma: 1.7s (83.5x, 0.0%)
-- many-xpts --
many-xpts valgrind-new:0.07s no: 0.6s ( 9.1x, -----) me: 3.1s (44.6x, -----) ca:371.7s (5310.3x, -----) he: 6.5s (93.3x, -----) ca: 2.8s (39.9x, -----) dr: 2.5s (35.4x, -----) ma: 2.6s (37.6x, -----)
many-xpts valgrind-old:0.07s no: 0.6s ( 9.1x, 0.0%) me: 3.1s (44.4x, 0.3%) ca:374.9s (5356.4x, -0.9%) he: 6.5s (93.3x, 0.0%) ca: 2.8s (39.7x, 0.4%) dr: 2.6s (37.7x, -6.5%) ma: 2.6s (37.6x, 0.0%)
-- sarp --
sarp valgrind-new:0.03s no: 0.6s (19.3x, -----) me: 3.4s (114.0x, -----) ca: 3.2s (105.3x, -----) he:16.6s (552.0x, -----) ca: 2.0s (68.3x, -----) dr: 1.4s (45.0x, -----) ma: 0.5s (16.0x, -----)
sarp valgrind-old:0.03s no: 0.6s (19.3x, 0.0%) me: 3.4s (114.0x, 0.0%) ca: 3.2s (107.0x, -1.6%) he:16.2s (541.0x, 2.0%) ca: 2.0s (68.3x, 0.0%) dr: 1.7s (56.0x,-24.4%) ma: 0.5s (16.0x, 0.0%)
-- tinycc --
tinycc valgrind-new:0.22s no: 2.8s (12.9x, -----) me:14.4s (65.6x, -----) ca:30.0s (136.2x, -----) he:27.7s (125.9x, -----) ca:21.3s (97.0x, -----) dr:20.7s (94.0x, -----) ma: 4.0s (18.1x, -----)
tinycc valgrind-old:0.22s no: 2.9s (13.0x, -0.7%) me:14.6s (66.3x, -1.0%) ca:30.0s (136.3x, -0.0%) he:27.8s (126.3x, -0.3%) ca:21.3s (96.9x, 0.1%) dr:22.2s (101.1x, -7.6%) ma: 4.0s (18.2x, -0.5%)
-- Finished tests in perf ----------------------------------------------
== 11 programs, 154 timings =================
real 110m49.915s
user 109m56.088s
sys 0m43.543s
From: Tom H. <to...@co...> - 2014-06-13 03:32:35
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (GCC) 4.3.0 20080428 (Red Hat 4.3.0-8)
GDB:
Assembler: GNU assembler version 2.18.50.0.6-2 20080403
C library: GNU C Library stable release version 2.8
uname -mrs: Linux 3.13.10-200.fc20.x86_64 x86_64
Vendor version: Fedora release 9 (Sulphur)
Nightly build on bristol ( x86_64, Fedora 9 )
Started at 2014-06-13 03:51:11 BST
Ended at 2014-06-13 04:32:17 BST
Results unchanged from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 650 tests, 2 stderr failures, 1 stdout failure, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/amd64/insn-pcmpistri (stderr)
memcheck/tests/err_disable4 (stderr)
none/tests/amd64/sse4-64 (stdout)
From: Rich C. <rc...@wi...> - 2014-06-13 03:05:16
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (SUSE Linux) 4.7.2 20130108 [gcc-4_7-branch revision 195012]
GDB: GNU gdb (GDB) SUSE (7.5.1-2.1.1)
Assembler: GNU assembler (GNU Binutils; openSUSE 12.3) 2.23.1
C library: GNU C Library (GNU libc) stable release version 2.17 (git c758a6861537)
uname -mrs: Linux 3.7.9-1.1-desktop x86_64
Vendor version: Welcome to openSUSE 12.3 "Dartmouth" Beta 1 - Kernel %r (%t).
Nightly build on ultra ( gcc (SUSE Linux) 4.7.2 20130108 [gcc-4_7-branch revision 195012] Linux 3.7.9-1.1-desktop x86_64 )
Started at 2014-06-12 21:30:01 CDT
Ended at 2014-06-12 22:05:05 CDT
Results unchanged from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 675 tests, 0 stderr failures, 0 stdout failures, 2 stderrB failures, 0 stdoutB failures, 0 post failures ==
gdbserver_tests/hginfo (stderrB)
gdbserver_tests/mssnapshot (stderrB)
=================================================
./valgrind-new/gdbserver_tests/hginfo.stderrB.diff
=================================================
--- hginfo.stderrB.exp 2014-06-12 21:48:03.843162514 -0500
+++ hginfo.stderrB.out 2014-06-12 21:53:26.786329105 -0500
@@ -1,5 +1,11 @@
relaying data between gdb and process ....
+Missing separate debuginfo for /lib64/ld-linux-x86-64.so.2
+Try: zypper install -C "debuginfo(build-id)=ecb8ef1a6904a2a3ec60a527f415f520c8636158"
vgdb-error value changed from 0 to 999999
+Missing separate debuginfo for /lib64/libpthread.so.0
+Try: zypper install -C "debuginfo(build-id)=ef5f5dbcb2398c608fef7884e1bfb65be3b5f0ef"
+Missing separate debuginfo for /lib64/libc.so.6
+Try: zypper install -C "debuginfo(build-id)=bd1473e8e6a4c10a14731b5be4b35b4e87db2af7"
Lock ga 0x........ {
Address 0x........ is 0 bytes inside data symbol "mx"
kind mbRec
=================================================
./valgrind-new/gdbserver_tests/mssnapshot.stderrB.diff
=================================================
--- mssnapshot.stderrB.exp 2014-06-12 21:48:03.847162405 -0500
+++ mssnapshot.stderrB.out 2014-06-12 21:53:59.289444641 -0500
@@ -1,5 +1,11 @@
relaying data between gdb and process ....
+Missing separate debuginfo for /lib64/ld-linux-x86-64.so.2
+Try: zypper install -C "debuginfo(build-id)=ecb8ef1a6904a2a3ec60a527f415f520c8636158"
vgdb-error value changed from 0 to 999999
+Missing separate debuginfo for /lib64/libpthread.so.0
+Try: zypper install -C "debuginfo(build-id)=ef5f5dbcb2398c608fef7884e1bfb65be3b5f0ef"
+Missing separate debuginfo for /lib64/libc.so.6
+Try: zypper install -C "debuginfo(build-id)=bd1473e8e6a4c10a14731b5be4b35b4e87db2af7"
general valgrind monitor commands:
help [debug] : monitor command help. With debug: + debugging commands
v.wait [<ms>] : sleep <ms> (default 0) then continue
=================================================
./valgrind-old/gdbserver_tests/hginfo.stderrB.diff
=================================================
--- hginfo.stderrB.exp 2014-06-12 21:30:17.264513044 -0500
+++ hginfo.stderrB.out 2014-06-12 21:33:47.837630542 -0500
@@ -1,5 +1,11 @@
relaying data between gdb and process ....
+Missing separate debuginfo for /lib64/ld-linux-x86-64.so.2
+Try: zypper install -C "debuginfo(build-id)=ecb8ef1a6904a2a3ec60a527f415f520c8636158"
vgdb-error value changed from 0 to 999999
+Missing separate debuginfo for /lib64/libpthread.so.0
+Try: zypper install -C "debuginfo(build-id)=ef5f5dbcb2398c608fef7884e1bfb65be3b5f0ef"
+Missing separate debuginfo for /lib64/libc.so.6
+Try: zypper install -C "debuginfo(build-id)=bd1473e8e6a4c10a14731b5be4b35b4e87db2af7"
Lock ga 0x........ {
Address 0x........ is 0 bytes inside data symbol "mx"
kind mbRec
=================================================
./valgrind-old/gdbserver_tests/mssnapshot.stderrB.diff
=================================================
--- mssnapshot.stderrB.exp 2014-06-12 21:30:17.268512932 -0500
+++ mssnapshot.stderrB.out 2014-06-12 21:34:19.653741910 -0500
@@ -1,5 +1,11 @@
relaying data between gdb and process ....
+Missing separate debuginfo for /lib64/ld-linux-x86-64.so.2
+Try: zypper install -C "debuginfo(build-id)=ecb8ef1a6904a2a3ec60a527f415f520c8636158"
vgdb-error value changed from 0 to 999999
+Missing separate debuginfo for /lib64/libpthread.so.0
+Try: zypper install -C "debuginfo(build-id)=ef5f5dbcb2398c608fef7884e1bfb65be3b5f0ef"
+Missing separate debuginfo for /lib64/libc.so.6
+Try: zypper install -C "debuginfo(build-id)=bd1473e8e6a4c10a14731b5be4b35b4e87db2af7"
general valgrind monitor commands:
help [debug] : monitor command help. With debug: + debugging commands
v.wait [<ms>] : sleep <ms> (default 0) then continue
From: Tom H. <to...@co...> - 2014-06-13 02:42:53
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (GCC) 4.6.3 20120306 (Red Hat 4.6.3-2)
GDB: GNU gdb (GDB) Fedora (7.3.50.20110722-16.fc16)
Assembler: GNU assembler version 2.21.53.0.1-6.fc16 20110716
C library: GNU C Library development release version 2.14.90
uname -mrs: Linux 3.13.10-200.fc20.x86_64 x86_64
Vendor version: Fedora release 16 (Verne)
Nightly build on bristol ( x86_64, Fedora 16 )
Started at 2014-06-13 03:01:33 BST
Ended at 2014-06-13 03:42:41 BST
Results unchanged from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 682 tests, 1 stderr failure, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable4 (stderr)
From: Tom H. <to...@co...> - 2014-06-13 02:30:41
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (GCC) 4.7.2 20120921 (Red Hat 4.7.2-2)
GDB: GNU gdb (GDB) Fedora (7.4.50.20120120-54.fc17)
Assembler: GNU assembler version 2.22.52.0.1-10.fc17 20120131
C library: GNU C Library stable release version 2.15
uname -mrs: Linux 3.13.10-200.fc20.x86_64 x86_64
Vendor version: Fedora release 17 (Beefy Miracle)
Nightly build on bristol ( x86_64, Fedora 17 (Beefy Miracle) )
Started at 2014-06-13 02:51:07 BST
Ended at 2014-06-13 03:30:24 BST
Results differ from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 682 tests, 5 stderr failures, 1 stdout failure, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
gdbserver_tests/mcinfcallRU (stderr)
gdbserver_tests/mcinfcallWSRU (stderr)
gdbserver_tests/mcmain_pic (stderr)
memcheck/tests/err_disable4 (stderr)
exp-sgcheck/tests/preen_invars (stdout)
exp-sgcheck/tests/preen_invars (stderr)
=================================================
== Results from 24 hours ago ==
=================================================
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 682 tests, 6 stderr failures, 2 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
gdbserver_tests/mcinfcallRU (stderr)
gdbserver_tests/mcinfcallWSRU (stderr)
gdbserver_tests/mcmain_pic (stderr)
memcheck/tests/err_disable4 (stderr)
none/tests/fdleak_ipv4 (stdout)
none/tests/fdleak_ipv4 (stderr)
exp-sgcheck/tests/preen_invars (stdout)
exp-sgcheck/tests/preen_invars (stderr)
=================================================
== Difference between 24 hours ago and now ==
=================================================
*** old.short 2014-06-13 03:10:31.061981125 +0100
--- new.short 2014-06-13 03:30:24.797159623 +0100
***************
*** 8,10 ****
! == 682 tests, 6 stderr failures, 2 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
  gdbserver_tests/mcinfcallRU (stderr)
--- 8,10 ----
! == 682 tests, 5 stderr failures, 1 stdout failure, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
  gdbserver_tests/mcinfcallRU (stderr)
***************
*** 13,16 ****
  memcheck/tests/err_disable4 (stderr)
- none/tests/fdleak_ipv4 (stdout)
- none/tests/fdleak_ipv4 (stderr)
  exp-sgcheck/tests/preen_invars (stdout)
--- 13,14 ----
From: Rich C. <rc...@wi...> - 2014-06-13 02:29:51
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (SUSE Linux) 4.8.1 20130909 [gcc-4_8-branch revision 202388]
GDB: GNU gdb (GDB; openSUSE Factory) 7.6.50.20130731-cvs
Assembler: GNU assembler (GNU Binutils; openSUSE Factory) 2.23.2
C library: GNU C Library (GNU libc) stable release version 2.18 (git )
uname -mrs: Linux 3.11.4-3-desktop x86_64
Vendor version: Welcome to openSUSE 13.1 "Bottle" Beta 1 - Kernel %r (%t).
Nightly build on rodan ( Linux 3.11.4-3-desktop x86_64 )
Started at 2014-06-12 19:22:01 CDT
Ended at 2014-06-12 21:29:40 CDT
Results differ from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 597 tests, 5 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable3 (stderr)
memcheck/tests/err_disable4 (stderr)
memcheck/tests/threadname (stderr)
memcheck/tests/threadname_xml (stderr)
exp-sgcheck/tests/hackedbz2 (stderr)
=================================================
== Results from 24 hours ago ==
=================================================
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 596 tests, 5 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable3 (stderr)
memcheck/tests/err_disable4 (stderr)
memcheck/tests/threadname (stderr)
memcheck/tests/threadname_xml (stderr)
exp-sgcheck/tests/hackedbz2 (stderr)
=================================================
== Difference between 24 hours ago and now ==
=================================================
*** old.short Thu Jun 12 20:26:27 2014
--- new.short Thu Jun 12 21:29:40 2014
***************
*** 8,10 ****
! == 596 tests, 5 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable3 (stderr)
--- 8,10 ----
! == 597 tests, 5 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable3 (stderr)
=================================================
./valgrind-new/exp-sgcheck/tests/hackedbz2.stderr.diff-glibc28-amd64
=================================================
--- hackedbz2.stderr.exp-glibc28-amd64 2014-06-12 20:26:55.762197464 -0500
+++ hackedbz2.stderr.out 2014-06-12 21:28:20.526942403 -0500
@@ -1,7 +1,6 @@
Invalid read of size 1
- at 0x........: vex_strlen (hackedbz2.c:1006)
- by 0x........: add_to_myprintf_buf (hackedbz2.c:1284)
+ at 0x........: add_to_myprintf_buf (hackedbz2.c:1006)
by 0x........: vex_printf (hackedbz2.c:1155)
by 0x........: BZ2_compressBlock (hackedbz2.c:4039)
by 0x........: handle_compress (hackedbz2.c:4761)
=================================================
./valgrind-new/memcheck/tests/err_disable3.stderr.diff
=================================================
--- err_disable3.stderr.exp 2014-06-12 20:26:54.178179088 -0500
+++ err_disable3.stderr.out 2014-06-12 20:45:27.885098593 -0500
@@ -10,8 +10,6 @@
Thread 2:
Invalid read of size 1
at 0x........: err (err_disable3.c:25)
- by 0x........: child_fn (err_disable3.c:31)
- ...
Address 0x........ is 5 bytes inside a block of size 10 free'd
at 0x........: free (vg_replace_malloc.c:...)
by 0x........: main (err_disable3.c:42)
=================================================
./valgrind-new/memcheck/tests/err_disable4.stderr.diff
=================================================
--- err_disable4.stderr.exp 2014-06-12 20:26:52.201156154 -0500
+++ err_disable4.stderr.out 2014-06-12 20:45:32.398150946 -0500
@@ -1501,8 +1501,6 @@
Thread x:
Invalid read of size 1
at 0x........: err (err_disable4.c:41)
- by 0x........: child_fn_2 (err_disable4.c:55)
- ...
Address 0x........ is 5 bytes inside a block of size 10 free'd
at 0x........: free (vg_replace_malloc.c:...)
by 0x........: main (err_disable4.c:68)
=================================================
./valgrind-new/memcheck/tests/threadname.stderr.diff
=================================================
--- threadname.stderr.exp 2014-06-12 20:26:52.165155736 -0500
+++ threadname.stderr.out 2014-06-12 20:51:28.767284990 -0500
@@ -9,36 +9,12 @@
Thread 2:
Invalid write of size 1
at 0x........: bad_things (threadname.c:16)
- by 0x........: child_fn_0 (threadname.c:53)
- ...
Address 0x........ is 0 bytes after a block of size 2 alloc'd
at 0x........: malloc (vg_replace_malloc.c:...)
by 0x........: bad_things (threadname.c:15)
by 0x........: child_fn_0 (threadname.c:53)
...
-Thread 3 try1:
-Invalid write of size 1
- at 0x........: bad_things (threadname.c:16)
- by 0x........: child_fn_1 (threadname.c:38)
- ...
- Address 0x........ is 0 bytes after a block of size 3 alloc'd
- at 0x........: malloc (vg_replace_malloc.c:...)
- by 0x........: bad_things (threadname.c:15)
- by 0x........: child_fn_1 (threadname.c:38)
- ...
-
-Thread 4 012345678901234:
-Invalid write of size 1
- at 0x........: bad_things (threadname.c:16)
- by 0x........: child_fn_2 (threadname.c:26)
- ...
- Address 0x........ is 0 bytes after a block of size 4 alloc'd
- at 0x........: malloc (vg_replace_malloc.c:...)
- by 0x........: bad_things (threadname.c:15)
- by 0x........: child_fn_2 (threadname.c:26)
- ...
-
Thread 1:
Invalid write of size 1
at 0x........: bad_things (threadname.c:16)
=================================================
./valgrind-new/memcheck/tests/threadname_xml.stderr.diff
=================================================
--- threadname_xml.stderr.exp 2014-06-12 20:26:52.163155713 -0500
+++ threadname_xml.stderr.out 2014-06-12 20:51:30.812308713 -0500
@@ -94,14 +94,6 @@
<file>threadname.c</file>
<line>...</line>
</frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_0</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
</stack>
<auxwhat>Address 0x........ is 0 bytes after a block of size 2 alloc'd</auxwhat>
<stack>
@@ -135,112 +127,6 @@
<error>
<unique>0x........</unique>
<tid>...</tid>
- <threadname>try1</threadname>
- <kind>InvalidWrite</kind>
- <what>Invalid write of size 1</what>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>bad_things</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_1</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- </stack>
- <auxwhat>Address 0x........ is 0 bytes after a block of size 3 alloc'd</auxwhat>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>malloc</fn>
- <dir>...</dir>
- <file>vg_replace_malloc.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>bad_things</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_1</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- </stack>
-</error>
-
-<error>
- <unique>0x........</unique>
- <tid>...</tid>
- <threadname>012345678901234</threadname>
- <kind>InvalidWrite</kind>
- <what>Invalid write of size 1</what>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>bad_things</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_2</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- </stack>
- <auxwhat>Address 0x........ is 0 bytes after a block of size 4 alloc'd</auxwhat>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
<truncated beyond 100 lines>
=================================================
./valgrind-old/exp-sgcheck/tests/hackedbz2.stderr.diff-glibc28-amd64
=================================================
--- hackedbz2.stderr.exp-glibc28-amd64 2014-06-12 19:23:57.709370340 -0500
+++ hackedbz2.stderr.out 2014-06-12 20:25:08.539953637 -0500
@@ -1,7 +1,6 @@
Invalid read of size 1
- at 0x........: vex_strlen (hackedbz2.c:1006)
- by 0x........: add_to_myprintf_buf (hackedbz2.c:1284)
+ at 0x........: add_to_myprintf_buf (hackedbz2.c:1006)
by 0x........: vex_printf (hackedbz2.c:1155)
by 0x........: BZ2_compressBlock (hackedbz2.c:4039)
by 0x........: handle_compress (hackedbz2.c:4761)
=================================================
./valgrind-old/memcheck/tests/err_disable3.stderr.diff
=================================================
--- err_disable3.stderr.exp 2014-06-12 19:22:44.595522186 -0500
+++ err_disable3.stderr.out 2014-06-12 19:42:41.327404818 -0500
@@ -10,8 +10,6 @@
Thread 2:
Invalid read of size 1
at 0x........: err (err_disable3.c:25)
- by 0x........: child_fn (err_disable3.c:31)
- ...
Address 0x........ is 5 bytes inside a block of size 10 free'd
at 0x........: free (vg_replace_malloc.c:...)
by 0x........: main (err_disable3.c:42)
=================================================
./valgrind-old/memcheck/tests/err_disable4.stderr.diff
=================================================
--- err_disable4.stderr.exp 2014-06-12 19:22:46.786547603 -0500
+++ err_disable4.stderr.out 2014-06-12 19:42:45.743456046 -0500
@@ -1501,8 +1501,6 @@
Thread x:
Invalid read of size 1
at 0x........: err (err_disable4.c:41)
- by 0x........: child_fn_2 (err_disable4.c:55)
- ...
Address 0x........ is 5 bytes inside a block of size 10 free'd
at 0x........: free (vg_replace_malloc.c:...)
by 0x........: main (err_disable4.c:68)
=================================================
./valgrind-old/memcheck/tests/threadname.stderr.diff
=================================================
--- threadname.stderr.exp 2014-06-12 19:22:46.749547174 -0500
+++ threadname.stderr.out 2014-06-12 19:48:41.122578605 -0500
@@ -9,36 +9,12 @@
Thread 2:
Invalid write of size 1
at 0x........: bad_things (threadname.c:16)
- by 0x........: child_fn_0 (threadname.c:53)
- ...
Address 0x........ is 0 bytes after a block of size 2 alloc'd
at 0x........: malloc (vg_replace_malloc.c:...)
by 0x........: bad_things (threadname.c:15)
by 0x........: child_fn_0 (threadname.c:53)
...
-Thread 3 try1:
-Invalid write of size 1
- at 0x........: bad_things (threadname.c:16)
- by 0x........: child_fn_1 (threadname.c:38)
- ...
- Address 0x........ is 0 bytes after a block of size 3 alloc'd
- at 0x........: malloc (vg_replace_malloc.c:...)
- by 0x........: bad_things (threadname.c:15)
- by 0x........: child_fn_1 (threadname.c:38)
- ...
-
-Thread 4 012345678901234:
-Invalid write of size 1
- at 0x........: bad_things (threadname.c:16)
- by 0x........: child_fn_2 (threadname.c:26)
- ...
- Address 0x........ is 0 bytes after a block of size 4 alloc'd
- at 0x........: malloc (vg_replace_malloc.c:...)
- by 0x........: bad_things (threadname.c:15)
- by 0x........: child_fn_2 (threadname.c:26)
- ...
-
Thread 1:
Invalid write of size 1
at 0x........: bad_things (threadname.c:16)
=================================================
./valgrind-old/memcheck/tests/threadname_xml.stderr.diff
=================================================
--- threadname_xml.stderr.exp 2014-06-12 19:22:46.748547162 -0500
+++ threadname_xml.stderr.out 2014-06-12 19:48:43.175602421 -0500
@@ -94,14 +94,6 @@
<file>threadname.c</file>
<line>...</line>
</frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_0</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
</stack>
<auxwhat>Address 0x........ is 0 bytes after a block of size 2 alloc'd</auxwhat>
<stack>
@@ -135,112 +127,6 @@
<error>
<unique>0x........</unique>
<tid>...</tid>
- <threadname>try1</threadname>
- <kind>InvalidWrite</kind>
- <what>Invalid write of size 1</what>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>bad_things</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_1</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- </stack>
- <auxwhat>Address 0x........ is 0 bytes after a block of size 3 alloc'd</auxwhat>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>malloc</fn>
- <dir>...</dir>
- <file>vg_replace_malloc.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>bad_things</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_1</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- </stack>
-</error>
-
-<error>
- <unique>0x........</unique>
- <tid>...</tid>
- <threadname>012345678901234</threadname>
- <kind>InvalidWrite</kind>
- <what>Invalid write of size 1</what>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>bad_things</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
- <fn>child_fn_2</fn>
- <dir>...</dir>
- <file>threadname.c</file>
- <line>...</line>
- </frame>
- </stack>
- <auxwhat>Address 0x........ is 0 bytes after a block of size 4 alloc'd</auxwhat>
- <stack>
- <frame>
- <ip>0x........</ip>
- <obj>...</obj>
<truncated beyond 100 lines>
From: Tom H. <to...@co...> - 2014-06-13 02:22:32
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (GCC) 4.7.2 20121109 (Red Hat 4.7.2-8)
GDB: GNU gdb (GDB) Fedora 7.5.1-42.fc18
Assembler: GNU assembler version 2.23.51.0.1-10.fc18 20120806
C library: GNU C Library stable release version 2.16
uname -mrs: Linux 3.13.10-200.fc20.x86_64 x86_64
Vendor version: Fedora release 18 (Spherical Cow)
Nightly build on bristol ( x86_64, Fedora 18 (Spherical Cow) )
Started at 2014-06-13 02:41:15 BST
Ended at 2014-06-13 03:22:17 BST
Results unchanged from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 682 tests, 2 stderr failures, 1 stdout failure, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable4 (stderr)
exp-sgcheck/tests/preen_invars (stdout)
exp-sgcheck/tests/preen_invars (stderr)
From: Tom H. <to...@co...> - 2014-06-13 02:10:58
valgrind revision: 14027
VEX revision: 2874
C compiler: gcc (GCC) 4.8.2 20131212 (Red Hat 4.8.2-7)
GDB: GNU gdb (GDB) Fedora 7.6.1-46.fc19
Assembler: GNU assembler version 2.23.52.0.1-9.fc19 20130226
C library: GNU C Library (GNU libc) stable release version 2.17
uname -mrs: Linux 3.13.10-200.fc20.x86_64 x86_64
Vendor version: Fedora release 19 (Schrödinger's Cat)
Nightly build on bristol ( x86_64, Fedora 19 (Schrödinger's Cat) )
Started at 2014-06-13 02:31:29 BST
Ended at 2014-06-13 03:10:43 BST
Results differ from 24 hours ago
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 682 tests, 3 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable4 (stderr)
none/tests/fdleak_ipv4 (stderr)
exp-sgcheck/tests/hackedbz2 (stderr)
=================================================
== Results from 24 hours ago ==
=================================================
Checking out valgrind source tree ... done
Configuring valgrind ... done
Building valgrind ... done
Running regression tests ... failed
Regression test results follow
== 682 tests, 2 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable4 (stderr)
exp-sgcheck/tests/hackedbz2 (stderr)
=================================================
== Difference between 24 hours ago and now ==
=================================================
*** old.short 2014-06-13 02:51:01.344693222 +0100
--- new.short 2014-06-13 03:10:43.127737976 +0100
***************
*** 8,11 ****
! == 682 tests, 2 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable4 (stderr)
exp-sgcheck/tests/hackedbz2 (stderr)
--- 8,12 ----
! == 682 tests, 3 stderr failures, 0 stdout failures, 0 stderrB failures, 0 stdoutB failures, 0 post failures ==
memcheck/tests/err_disable4 (stderr)
+ none/tests/fdleak_ipv4 (stderr)
exp-sgcheck/tests/hackedbz2 (stderr)
From: Siddharth N. <sn...@dr...> - 2014-06-13 01:22:23
Hi All,

I made a modification of the Callgrind tool to measure the number of bytes communicated between functions. To do this, I used a shadow memory similar to what was described in the "How to shadow every byte" paper. As my secondary maps are quite large (sometimes up to 2.5MB), the memory usage of the tool becomes sky high, especially for memory-intensive applications.

I am keeping track of all my VG_mallocs and I periodically calculate the heap usage. However, I noticed that for some memory-intensive benchmarks the VIRT reported by Linux is much larger than my estimate of memory. For example, in one scenario I report 18GB of allocated memory, while the VIRT reported by "top" shows 32GB. The tool stops running on the next memory allocation request, and Valgrind says that it cannot allocate any more memory.

I checked my calculation of heap usage several times and I believe it's correct. I tried running the tool under Memcheck itself to see if there are any leaks, but Memcheck does not seem to report anything wrong. In fact, judging by the numbers I see, I'm not sure Memcheck is seeing many of the allocations requested by my tool.

Any theories on why this may be happening? Is there a way I can periodically check whether the number of new virtual pages requested during a malloc is greater than the actual malloc'ed memory?

Thanks,
Sid