Path: utzoo!utgpu!jarvis.csri.toronto.edu!rutgers!cs.utexas.edu!usc!ucsd!chem.ucsd.edu!tps
From: tps@chem.ucsd.edu (Tom Stockfisch)
Newsgroups: comp.lang.c
Subject: Re: Heap Fragmentation
Message-ID: <575@chem.ucsd.EDU>
Date: 30 Sep 89 05:55:11 GMT
References: <1989Sep27.045926.12634@polyslo.CalPoly.EDU> <35076@apple.Apple.COM> <11161@smoke.BRL.MIL>
Reply-To: tps@chem.ucsd.edu (Tom Stockfisch)
Organization: Chemistry Dept, UC San Diego
Lines: 45

In article <11161@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn) writes:

>>In article <1989Sep27.045926.12634@polyslo.CalPoly.EDU> ttwang@polyslo.CalPoly.EDU (Thomas Wang) writes:
>>>Is heap fragmentation in C a problem or non-problem?

>>In summary, I would say heap fragmentation under UNIX is not a problem.

>It can be, but unless you're pushing the limits on maximum virtual
>data space size it would take a rather unusual memory allocator usage
>pattern.

I would disagree, at least with the Berkeley malloc package.  It can easily
waste a factor of 4 or 5.  The following excerpt is certainly not
unusual, yet malloc gets 45984 bytes from the operating system to
satisfy an 8192 byte need:


	int	size =	1;
	char	*p =	malloc(size);

	while (size <= 8192)
	{
		p =	realloc( p, size );
		size *=	2;
	}

I find my programs that need a significant amount of memory 
generally consume 5-8 times as much memory as they
need theoretically.

The Berkeley malloc keeps, essentially, a free list for each request
size: 8 bytes of overhead are added to the request, which is then rounded
up to the nearest power of two, with an 8-byte minimum.

The "worst case" behavior for this, assuming an actual need of MAX
bytes, requires roughly

        MAX * (log2(MAX) - 2)

bytes from the operating system.

For instance, a program that needed only 8 meg might have malloc
asking for 320 meg from the operating system!!
-- 

|| Tom Stockfisch, UCSD Chemistry	tps@chem.ucsd.edu