Path: utzoo!attcan!uunet!lll-winken!uwm.edu!gem.mps.ohio-state.edu!apple!mikes
From: mikes@Apple.COM (Mike Shannon)
Newsgroups: comp.lang.c
Subject: Re: Heap Fragmentation
Message-ID: <35076@apple.Apple.COM>
Date: 27 Sep 89 23:49:28 GMT
References: <1989Sep27.045926.12634@polyslo.CalPoly.EDU>
Distribution: na
Organization: Apple Computer Inc, Cupertino, CA
Lines: 24

In article <1989Sep27.045926.12634@polyslo.CalPoly.EDU> ttwang@polyslo.CalPoly.EDU (Thomas Wang) writes:
>Is heap fragmentation in C a problem or non-problem?

Memory management is not part of the C language; it is part of the
library support in the underlying system.  The last I heard, there is
an effective scheme for re-using de-allocated memory which uses
hashing.  It's my understanding that de-allocated memory chunks are
never handed back to the operating system, but are kept in an array
of queues in which the chunk size doubles from one queue to the next.
So when you do a malloc(xxx), the first queue holding blocks of size
greater than or equal to xxx is searched, a block is broken in two,
and you get part of it.  This means you don't pay the overhead of a
system call on every allocation.  (A rough sketch of such a scheme
appears after my signature.)

I personally have never had a problem with heap fragmentation under
UNIX.

For performance reasons, if I am going to be allocating and
de-allocating memory of a single fixed size, I often maintain my own
'free' list and do my own allocation, checking that list before
calling malloc().  (A sketch of this trick also follows below.)

In summary, I would say heap fragmentation under UNIX is not a
problem.
-- 
Michael Shannon	{apple!mikes, mikes@apple.com}
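
P.S.  To make the bucket scheme concrete, here is a minimal sketch of
a power-of-two free-list allocator.  This is NOT the actual UNIX
malloc() source: the bucket count, the header layout, and the use of
malloc() itself (rather than sbrk()) to grow the arena are my own
illustrative choices, and the block-splitting step mentioned above is
omitted for brevity.  The bucket index plays the role of the hash.

#include <stdio.h>
#include <stdlib.h>

#define NBUCKETS 16   /* bucket i holds blocks of 2^(i+MINLOG) bytes */
#define MINLOG    4   /* smallest payload: 16 bytes */

struct header {
    struct header *next;  /* while free: next block in the queue */
    int idx;              /* while allocated: which bucket owns it */
};

static struct header *bucket[NBUCKETS];  /* the array of queues */

/* Smallest bucket whose blocks can hold n bytes of payload. */
static int bucket_for(size_t n)
{
    int i = 0;
    while (((size_t)1 << (i + MINLOG)) < n && i < NBUCKETS - 1)
        i++;
    return i;
}

void *my_alloc(size_t n)
{
    int i = bucket_for(n);
    struct header *h = bucket[i];

    if (h != NULL)
        bucket[i] = h->next;   /* reuse a freed chunk: no system call */
    else {
        h = (struct header *)malloc(sizeof(struct header)
                                    + ((size_t)1 << (i + MINLOG)));
        if (h == NULL)
            return NULL;
    }
    h->idx = i;
    return (void *)(h + 1);    /* payload sits just past the header */
}

/* "Freed" chunks go back on their queue, never back to the OS. */
void my_free(void *p)
{
    struct header *h = (struct header *)p - 1;
    h->next = bucket[h->idx];
    bucket[h->idx] = h;
}

int main(void)
{
    char *a = my_alloc(100);   /* drawn from the 128-byte bucket */
    my_free(a);
    char *b = my_alloc(100);   /* the same chunk comes straight back */
    printf(a == b ? "reused\n" : "fresh\n");
    return 0;
}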
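
And the private free list for fixed-size objects.  The struct and the
node_alloc()/node_free() names are made up for illustration; the point
is just that the common case never touches malloc():

#include <stdlib.h>

struct node {
    struct node *next;   /* doubles as the free-list link when idle */
    int data;
};

static struct node *freelist;  /* my own queue of recycled nodes */

struct node *node_alloc(void)
{
    struct node *n = freelist;
    if (n != NULL)
        freelist = n->next;    /* hit: skip malloc() altogether */
    else
        n = (struct node *)malloc(sizeof(struct node));
    return n;
}

void node_free(struct node *n)
{
    n->next = freelist;        /* push back onto my private list */
    freelist = n;
}

Since every node is the same size, this list can never fragment, and
allocation in the steady state is just a couple of pointer moves.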