I am practicing algorithms and data structures, so I keep profiling my programs. Here is an example of the output I get from gprof:
Each sample counts as 0.01 seconds.
  %   cumulative   self              self     total
 time   seconds   seconds    calls  ns/call  ns/call  name
 29.76      2.58     2.58                             _IO_vfscanf_internal
 14.30      3.82     1.24                             ____strtof_l_internal
 11.71      4.84     1.02                             memcpy
  7.84      5.52     0.68  8400000    80.95    80.95  insertorupdate
  5.94      6.03     0.52                             _int_malloc
  5.77      6.53     0.50                             round_and_return
  3.81      6.86     0.33                             _IO_sputbackc
  3.23      7.14     0.28                             strlen
  2.88      7.39     0.25                             __strcmp_ia32
  2.13      7.58     0.19                             __isoc99_scanf
  2.13      7.76     0.19                             fgets
  2.02      7.94     0.18                             _IO_getline_info
  1.27      8.05     0.11                             __mpn_lshift
  1.21      8.15     0.11                             __memchr_sse2_bsf
  0.87      8.23     0.08                             malloc
... rest ...
But I find it hard to understand what this output means. For example, my program spends most of its time in standard library functions; I assume that is a good sign, right? If one of my own functions were consuming most of the time, that could mean I was doing something wrong. But as I said, I am not sure I am interpreting this correctly, so any suggestions or pointers on how to read it are welcome.
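In case it helps, here is roughly what the hot path looks like. This is only a simplified sketch: the input format, the key/value types, and the body of insertorupdate are placeholders, and the only things taken from the real program are that it reads about 8.4 million records with scanf and passes each one to insertorupdate. The binary was built with -pg and the report above comes from running gprof on the resulting gmon.out.

/* Simplified sketch of the hot loop (not the real code): the format
 * string and types are placeholders, and insertorupdate is stubbed out
 * so the sketch compiles on its own. */
#include <stdio.h>

static void insertorupdate(const char *key, float value)
{
    /* stub: the real function updates my own data structure */
    (void)key;
    (void)value;
}

int main(void)
{
    char key[64];
    float value;

    /* one scanf call per input record, roughly 8.4 million records */
    while (scanf("%63s %f", key, &value) == 2)
        insertorupdate(key, value);

    return 0;
}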