Subject: Re: [BackupPC-users] BackupPC_dump memory usage
From: Holger Parplies <wbppc AT parplies DOT de>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Wed, 13 Mar 2013 03:43:43 +0100
Hi,

Arnold Krille wrote on 2013-03-12 21:48:18 +0100 [Re: [BackupPC-users] 
BackupPC_dump memory usage]:
> [...]
> Not again.
> 
> Please people, come join us in the 21st century. 64bit has been around
> long enough. If there is still an app that fails to compile/run on
> 64bits, its worthy of dropping it entirely.

not again. If there's still a person around that fails to realize that it's
not his job to decide what other people should do (partly but not only because
he can't judge their requirements and constraints), it's worth dropping him
entirely.

> If something basic like perl would still make problems on 64bits, there
> would be a legion of programmers fixing it.

Yes, undoubtedly. If, though, you are saying that people care much about
memory consumption (i.e. about using *too much* memory), then please come join
us in the 21st century. My *telephone* has problems running on 768 MB RAM with a 1
GHz processor. I can well remember running an X11 Server on a machine with 4
MB RAM and a swap*file* on a 3.5" FDD. It was more a proof-of-concept, and it
didn't exactly run smoothly, but it did run. Processing hasn't become more
difficult since those times. The same task doesn't per se require more memory
and a faster processor to get it done now than it did then. But programming
paradigms have changed. Tasks have changed. Who would honestly even consider
using 8 bits colour depth these days (even on a telephone)? Back then, we were
excited to have *colour*.

I don't see that Perl using more memory - within reasonable limits - on 64-bit
architectures would worry anyone. It's to be expected. Memory leaks would get
fixed, yes. Instability would get fixed, yes. But more memory usage hardly
sounds exciting. Perl isn't a tool that *typically* uses great amounts of
memory.

All of that said, I repeat that we'd have heard on this list if
BackupPC/File::RsyncP used unreasonable amounts of memory on 64-bit Linux. I
still don't believe the problem is what it seems to be. Not if it's really
only "millions of files".

> There is no problem running any contemporary app on any contemporary
> distribution on a 64bit processor and system. These apps also don't
> take up more memory, they get a very little bit bigger because
> memory-addresses are now 64bits instead of 32bits. But they should be
> using position-independant-code anyways.

In what way does PIC make *memory* addresses shorter? Are you suggesting we
re-introduce the concept of near-pointers and far-pointers? *shiver*

You do realize, though, that the main point of a 64-bit architecture is to
have a 64-bit ALU, right? And that correctly aligning memory accesses becomes
increasingly important?

Which "apps" exactly have you analyzed and found to not "take up more memory"?

> And now an app can also use more then 3Gbyte of ram. And your system can
> also have more than 3.2GB of ram without ugly clutches like PAE.

What exactly about PAE do you consider an ugly kludge?

> Nowadays 64bit isn't an exotic exception. I had trouble last week
> installing a box: Took me one attempt to realized that my hw-people
> gave me a 32-bit-only machine!

Welcome to the real world.

Regards,
Holger

------------------------------------------------------------------------------
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/