Appalling Security Perception in Malaysia and the Surrounding Region

I am totally appalled by the state of IT security knowledge in Malaysia, the country yours truly hails from. Some people hold some of the most ridiculous perceptions of security and hacking you could imagine, and I am talking about people who actually work in IT or related industries, ranging from CEOs to system developers to typical users. Below are some real-life examples of security misconceptions that I have experienced. There are more, actually, but some I have already forgotten. Bear in mind, names have been changed to protect the anonymity of the security idiots involved, and so that I do not get sued.

Case 1:

This case involves a client’s server that we manage. We are charged with maintaining the server, making sure it runs fine and that the websites hosted on it work properly. This is a big company that can certainly afford to pay for one server to host one website, a website that is not even that busy. It is a good way for them to maintain total control over the company’s private and confidential (P&C) material. When I took over managing this server, I found out they were actually paying a pentesting company to pentest the server every half year. I wasn’t told about this when I took over, so I did not update any packages on this Linux server. Naturally, my style of “if it ain’t broken, don’t fix it” has been a motto I have practiced for many years, and so far it has proved invaluable.

Then one day, out of the blue, I received an email from this client screaming about why their server has so many vulnerabilities and how we could give hackers a free pass into it. Well, they did not say that in so many words, but from their tone it certainly sounded like it. There were actually about five vulnerabilities, and some were irrelevant, such as mod_proxy, an Apache module that was enabled but not actually used by the website. Naturally, a few were harder to tackle, such as updating software packages like Apache and PHP to the very latest versions. Any Linux system administrator knows that to update a package to the bleeding edge you need to compile it manually, that compiling such packages is a PITA, and that it will certainly break the package manager that comes with the OS, such as apt-get (Ubuntu/Debian), yum (Red Hat/Fedora/CentOS) or YaST (openSUSE). So I went to talk to the client about this, and even spoke to the pentester. The client wanted industry PCI compliance, so there was no choice but to upgrade. I then managed to locate pre-compiled packages, so no manual compiling was needed, and upgraded the relevant packages. A few days later, the pentester said the new versions had vulnerabilities too, and we had to upgrade again. I had another talk with the client, but their CEO wanted PCI compliance. I then argued that there is no point making that server PCI compliant, because it does not do e-commerce and does not store any P&C material. The worst-case scenario is a hacker defacing the website; there is no information for them to steal. Eventually the client agreed that this server does not need PCI compliance. Whew… luckily the client understood. It helped that I pulled in a few more big words: site downtime, compatibility issues and so on.
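
As an aside, findings like the unused mod_proxy one are the easiest kind to clear. On a Debian/Ubuntu style Apache, disabling a module you are not using is a one-liner (a minimal sketch, assuming the a2dismod layout; check what your sites actually use first):

[cc lang="bash" escaped="true" width="100%"]
## list the modules Apache currently has loaded
apache2ctl -M

## disable the unused proxy module and restart Apache
a2dismod proxy
/etc/init.d/apache2 restart
[/cc]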

The lesson to be learned here is that clients often want some security thingie without knowing the reason why. A lot of pentest findings are also insignificant, yet clients take them very seriously without understanding them.

Case 2:

Not too long ago, a big company got hacked. A few of their websites in some countries got hacked, along with their biggest online portal. Needless to say, they were ashamed of this and started to panic, and then implemented some of the most ridiculous things one could imagine. Luckily we were not really affected, because we had completed most of our projects with them; only a few remaining projects were still ongoing.

Before this hacking problem occurred, I already had some problems with them. Their website domain is controlled by someone in a neighbouring country, supposedly a country with lots of talent. However, this person did not even know how to properly create a subdomain with A records. When we dealt with their counterparts in the local office, even the people in Malaysia were unable to handle it, and they were unwilling to take responsibility for it. It’s just a freaking subdomain creation with A records pointing at a server. So I knew these people were pretty much useless: a multinational company, WORLDWIDE, with these people in their IT department. All this while they rely heavily on vendors to support them, and to blame whenever problems happen.

Then this hacking issue happened, and they had to implement all the security policies they had been slacking on all this while. They started pentesting some of the sites hosted on our servers; these sites are temporary projects running their campaigns. Again, this ended up with some ridiculous “vulnerabilities”, such as not turning off PHP’s default “about” page. Quickly, they started pointing fingers, asking us why we let hackers walk freely through our servers, and specifically their website. It was almost like they wanted to sue us for having vulnerable servers. They actually assumed all servers run by big corporations are strictly 100% hack-proof. But they did not realize we are in the part of the world that is called South East Asia.
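
If that “about” page finding is what I think it is, PHP announcing itself through its X-Powered-By header and the old logo/credits easter-egg pages, then the fix is a single php.ini directive (a sketch from memory; I no longer have the actual report, and the ini path varies by distro):

[cc lang="bash" escaped="true" width="100%"]
## stop PHP from advertising itself (X-Powered-By header and the
## built-in "about"/credits easter-egg pages), then restart Apache
echo "expose_php = Off" >> /etc/php5/apache2/php.ini
/etc/init.d/apache2 restart
[/cc]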

Then, over the following weeks, they slowly took back all the projects and websites to host them on their own servers. Along the way, they requested some of the most useless things, though I have forgotten what they were; the sort of requests some inefficient “expert” would make.

Case 3:

Just recently we had a project to program a web portal with e-commerce using IIS and PHP. Another company got the project and subcontracted it to us, the reason being that they do not know PHP and only know .NET. So we created the staging server, built the prototype site and so on, until it reached the stage where the code needed to be installed on their server. They wanted us to install PHP for them, and also to discuss some security issues. As a veteran system administrator (who is still learning), I naturally requested Remote Desktop access to the system to install PHP. They replied wanting us to go on-site to install it instead, the reason being fear of security issues. I was like, WTF, everyone uses some sort of remote desktop to manage remote servers and has no problems. They think hackers are camped at their server’s doorstep waiting for them to open a port; once it is open, it’s yeah… free server to hack… yeah. Stupid. Then I said they could set it up so that the server only allows connections from our IP address, which would solve any security issue. As I replied, I already had a feeling they would not know how to do this and would come back with some other excuse. As expected, they replied saying they still wanted to meet up to discuss this and do some “risk analysis”. I really am speechless, and I can’t say much, as they are our client.
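
For the record, restricting Remote Desktop to one source IP is a single rule in Windows’ built-in firewall (a sketch, assuming Windows Server 2008 or newer; 203.0.113.10 is a placeholder for our office IP, not anything from their actual setup):

[cc lang="bash" escaped="true" width="100%"]
rem run in an elevated command prompt on the server
rem allow RDP (TCP 3389) in only from the vendor's IP address
netsh advfirewall firewall add rule name="RDP vendor only" dir=in action=allow protocol=TCP localport=3389 remoteip=203.0.113.10
[/cc]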

As usual, main contractors in Malaysia are people who only talk and have contacts. When it comes to real work, these people are useless. Sadly, in this part of the world, talent alone will not make you successful.


Are you a computer guy?

Well, this is nothing technical. Just a blurb about an experience most people can relate to, living in an Asian country where computers have not been around that long.

You see, a long time ago, when computers were first getting their buzz, a lot of parents just sent their kids to computer studies without knowing anything about the field. Everybody just said computers are the future, not knowing there are actually a lot of branches to IT. Especially once the internet became part of the IT world, there were even more branches of computer studies and majors. In this story, we have a typical IT graduate who works as a Software Engineer. His parents have not the slightest idea what he does, so he ends up being “the computer guy” to them. One day, during a morning coffee gathering, as most retired folks have in my part of the country, one of his parent’s friends asks:

Old friend A: Hey, Ah Kau, can you ask your computer son to come take a look at my son’s computer? He has been unable to turn it on for the past few days. Isn’t your son a computer guy? He’s working in IBM (old folks only know IBM as the computer company), right?

Computer Guy’s Father: Yeah, sure. My son is very smart. He has been working in IBM for 5 years already. He is very good with computers. I see him working with one every day, using this small computer (a laptop) that the company GAVE HIM.

Old friend A: Wah…he must be very good. Or else his boss will not give him a small “powderful” computer. I heard it’s very expensive. That’s why I bought the big one for my son.

CG Father: Yeah, yeah… my son is earning a 5-figure income now. (smirking like any proud parent would of their kid)

The following night at home:

CG Father: Hey, Derrick ah, can you go to Uncle A’s house and take a look at his son’s computer? He say the computer not working for past few days.

Computer Guy: Oh..cannot. I am not a computer technician.

CG Father: Didn’t I send you to study computer? You should know how to fix computer, right? After all, you are working in IBM.

Computer Guy: I only learned the basics of computer hardware in college. I am working as a software engineer now.

CG Father: What is software engineer? You are computer guy. You can handle everything about computer. Help out Uncle A. I told him you working in IBM. IBM is THE computer company, right?

Computer Guy: A Software Engineer is a person who designs and creates software for a computer system… blah… blah… blah… (after 5 minutes of explaining) …so that’s what a Software Engineer is.

CG Father: Huh? Duh…

Computer Guy doesn’t know how to respond… and ends up being a computer guy, fixing computers for Uncle A.

I am sure a lot of you older-generation IT guys can relate to this… computer guy thingie…


On-line cloud backup comparison – Backblaze vs Crashplan vs Spideroak vs Carbonite vs Mozy

On-line backup solutions have been getting more and more popular recently. This is a good idea, and I am sure many thought of such a backup solution long ago, back when tapes were a pain in the ass to manage. Tape was a PITA then, is a PITA now, and will always be a PITA. I hate tapes. Thanks to the emergence of cloud computing, backing up valuable data to these service providers is getting easier and, not to mention, CHEAPER.

I came across such a backup solution a year ago and have been looking into using one since. With what little time I have between work and play, I managed to test out a few, and I would like to comment on them. Okay, let’s look at the backup solution providers one by one. Bear in mind that I have not tried all of them; some I simply skipped because their price structure was not attractive to me.

 

Backblaze
URL: www.backblaze.com
Supported Platform: Windows, Mac

Comment: This was actually the first on-line backup provider that I tested. It was recommended by several review sites and got a lot of praise. Upon testing, I found their backup software to be just as most people said: simple and easy to use. You basically set it up and then let it run. The backup speed I got with them was also quite decent, with few hiccups, and I managed to upload over 300GB of data during their 30-day trial. A good balance between speed and ease of use.

Good: easy to use software, decent upload speed, good pricing

Bad: refuses to install on server OSes (Windows Server), lacks Linux support

My Rating: 7/10

 

Crashplan
URL: www.crashplan.com
Supported Platform: Windows, Mac, Linux

Comment: Crashplan seems to be quite a comprehensive provider. They have plans for everyone from basic home users to multi-level corporate users, so their pricing is well segmented. Their software is also very powerful, with a lot of customization. However, I did hit some problems with it. My data did not upload well when I scheduled it to upload only at night; when I checked the next day, it showed only around 2 hours of upload time. Even when I changed it to upload non-stop, 24 hours, it still did not upload as often as it should. Their software also seems unable to handle uploads when the data set is huge: within the 30-day trial, it only managed to upload a bit over 100GB of my 356GB of data. This is a shame, because the software works fine on my home test server (Ubuntu) with 9GB of data. A good thing about the software is that it can be installed on a server, which can save small companies a lot of cost if they only have one storage server that needs backing up. Another good thing is the extra features like friend backup and LAN backup: your friends can back up to your server, and LAN computers can upload their data to your server; from that one server, you then need only one account to upload all the data. I think this is one of Crashplan’s strongest points. I also need to mention that their speed is not that good from Malaysia, another reason my data did not get uploaded as much as with Backblaze.

Good: Supports all platforms, very powerful software, allows friend and LAN backups into one server for a single-account backup solution

Bad: software does not cope well with large data sets, upload speed is slow, software does not upload often even when set to 24-hour backup

My Rating: 8/10

 

Spideroak
URL: www.spideroak.com
Supported Platform: Windows, Mac, Linux

Comment: This is the latest on-line backup system that I am currently testing. So far I quite like them because they are fast: their website is fast, their backup is fast, and, unfortunately, their price is also quite high. The reason for the fast backup speed is the way they handle data compression, as stated on their website here. I like the fact that they also support multiple platforms, which is another plus. Their plan is simple too, with only one package starting from USD10 per month, growing as you use more space. The problem is that at this price you can get two accounts with UNLIMITED storage space from other providers. They also have a free 2GB account that potential customers can use to test their service, expandable to 50GB through their referral system. Speaking of which, I would really appreciate it if readers used my referral link when signing up for a free SpiderOak account.

Good: Supports all platforms, simple and fast software, fast uploads

Bad: price, price, price

My Rating: 8/10

 

Honourable mentions for services that I did not test:

Carbonite
URL: www.carbonite.com
Supported Platform: Windows, Mac
Reason for not testing: Price is slightly higher than the others, but it seems quite popular in the US.

 

Mozy
URL: mozy.com
Supported Platform: Windows, Mac
Reason for not testing: Price is also higher than the others. They have backup plans for servers, but those are even more expensive.

 

If you are going to test out SpiderOak, please click on my [referral link] so that I get 1GB added to my account. You will also get 1GB added to your free account. I appreciate it very much.


Acer Revo 100 review video by Attack Of The Show

Here is a fun review by AOTS (Attack of the Show) of the Acer Revo 100. AOTS is a very fun TV show to watch, covering modern lifestyle and tech stuff. Too bad you can only see this show in the US; I used to watch it a lot when it was still showing in Malaysia. The show seems to have lost some of its plus points when Olivia Munn left. She did an amazing job when she was around, and she is much more tech savvy than the new girl (I forget her name). As for Sara Underwood (a former Playmate), she is just eye candy for a show whose viewers are mostly teenagers and young adults.

httpv://www.youtube.com/watch?v=2zbj55LXNA0


Serviio DLNA media server

Media servers are becoming more and more widespread as the need for them arises. To meet the demand, a lot of multimedia boxes have been created that allow playing of video, audio and pictures. As these multimedia boxes improve, like the Eaget or WDTV and so on, they even integrate the ability to stream media from the internet, such as YouTube or PPS. Now, TV manufacturers do not want to be left behind, so all modern TVs are being built to support media streaming from the LAN or internet. One of the main protocols adopted is DLNA; refer to the Wiki here for more details.


DLNA is basically a service that provides media streaming to devices that use it. These devices can be TVs, media boxes, iPads and computers. There are a few other protocols besides DLNA, such as plain UPnP and so on; I am not sure of the exact differences, but all of them basically allow streaming of media. I dived into DLNA when I got myself a Panasonic LED TV, having first encountered it while doing some R&D before buying the new TV. After learning what DLNA is, I immediately wanted to give the thing a try.

For a DLNA TV or iPad (lots of apps now support DLNA/UPnP) to stream media, you first need a DLNA server. This is analogous to a file-sharing or Samba server (for those who are more technically inclined): you install the DLNA server software on your NAS, server, desktop and so on. There are tons of DLNA servers out there, supporting everything from Linux to Windows to Macs; some are commercial and some are free. As a person used to open-source software, I obviously opted for a free product. I looked into a few and eventually ended up using miniDLNA. miniDLNA is very simple and easy to set up; however, upon delving deeper into the DLNA world, I found it lacking a lot of features, especially streaming with subtitles and transcoding. Transcoding is the term for a DLNA server’s ability to re-encode video on the fly into a format your TV understands. For example, if your TV does not support the MKV file format, the DLNA server will re-encode the video on the fly into an MPEG-2 format that your TV does support. This is a very powerful feature and what makes DLNA so usable, because users need not worry about playing files in all sorts of formats. The problem with transcoding is that it is still not that well implemented yet; DLNA is relatively new, and a lot of its features and capabilities are not really stable yet.
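
To make the transcoding idea concrete, below is roughly what a DLNA server runs behind the scenes (a sketch using FFmpeg directly, with made-up file names; a real server pipes the output to the TV on the fly instead of writing a file):

[cc lang="bash" escaped="true" width="100%"]
## re-encode an MKV the TV cannot play into an MPEG-2 transport stream it can
ffmpeg -i movie.mkv -vcodec mpeg2video -b 8000k -acodec mp2 -ab 192k -f mpegts movie.ts
[/cc]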


Anyway, due to the limitations of miniDLNA, I eventually stumbled upon Serviio. This is a free (at the moment of writing) DLNA server that supports all sorts of platforms, including Windows, Macs and, most importantly, Linux. Naturally, the best place to test it was my HP ProLiant MicroServer running Ubuntu 10.04 LTS. What I like about Serviio is its forum community, which is a bunch of helpful chaps; the developer is always around to answer questions and help with technical support. There are also a bunch of customization profiles posted for different brands of TV. That is because different manufacturers implement DLNA differently, so the right TV profile is very important for transcoding and subtitles to work properly.

The latest version of Serviio is 0.6, which was released yesterday, so it’s totally hot off the SVN, and it improves a lot over the previous version. Serviio is a Java-based application and depends heavily on FFmpeg for its transcoding. I will post more about Serviio 0.6 after I am done upgrading from my old 0.5.2 and have tested it out.

Edit:

As promised, here is some more information on Serviio 0.6. Heck, I will throw in my upgrade experience from 0.5.2 to 0.6 as a reference for those having problems.

This is for Ubuntu 10.04 LTS; it might work on other Linux distros, but proceed at your own risk. I am also assuming you installed 0.5.2 previously and already have all the packages needed to compile FFmpeg.

  1. Download Serviio 0.6, FFmpeg and libRTMP 2.4 from the official website:

[cc lang="bash" escaped="true" width="100%" lines="5"]
wget http://download.serviio.org/releases/serviio-0.6-linux.tar.gz
wget http://download.serviio.org/opensource/ffmpeg-8bc3a4807e2da36f458e7784c3d390dbd19899a5.tar.gz
wget http://download.serviio.org/opensource/rtmpdump-c58cfb3e9208c6e6bc1aa18f1b1d650d799084e5.tar.gz
[/cc]

  2. Untar all the files:

[cc lang="bash" escaped="true" width="100%" lines="5"]
tar -zxvf rtmpdump-c58cfb3e9208c6e6bc1aa18f1b1d650d799084e5.tar.gz
tar -zxvf ffmpeg-8bc3a4807e2da36f458e7784c3d390dbd19899a5.tar.gz
tar -zxvf serviio-0.6-linux.tar.gz
[/cc]

  3. Compile rtmpdump:

[cc lang="bash" escaped="true" width="100%" lines="5"]
cd rtmpdump
make
make install
[/cc]

  4. Compile FFmpeg:

[cc lang="bash" escaped="true" width="100%"]
cd ../ffmpeg
./configure --enable-static --disable-shared --bindir=/tmp/ffmpeg --disable-ffplay --disable-ffserver --enable-libmp3lame --enable-pthreads --disable-mmx --extra-ldflags=-L/tmp/static/lib --extra-cflags=-I/tmp/static/include
make
make install

## check to confirm ffmpeg has been updated
ffmpeg -version
ffmpeg version 0.8.git, Copyright (c) 2000-2011 the FFmpeg developers
built on Sep 24 2011 01:49:45 with gcc 4.4.3
configuration: --enable-static --disable-shared --bindir=/tmp/ffmpeg
--disable-ffplay --disable-ffserver --enable-libmp3lame --enable-pthreads
--disable-mmx --extra-ldflags=-L/tmp/static/lib --extra-cflags=-I/tmp/static/include
libavutil    51. 11. 0 / 51. 11. 0
libavcodec   53.  8. 0 / 53.  8. 0
libavformat  53.  6. 0 / 53.  6. 0
libavdevice  53.  2. 0 / 53.  2. 0
libavfilter   2. 25. 0 /  2. 25. 0
libswscale    2.  0. 0 /  2.  0. 0

## if the version still shows the old one, copy the new ffmpeg binary over the existing file
## this shows the location of the current ffmpeg binary
which ffmpeg
/usr/local/bin/ffmpeg

## assuming your current directory is still the compiled ffmpeg source,
## which it should be if you have been following this guide from the start
cp ffmpeg /usr/local/bin/ffmpeg
[/cc]

  5. Replace the existing Serviio installation with the 0.6 version untarred earlier:

[cc lang="bash" escaped="true" width="100%"]
cd ..
## rename the old version in case we need to restore it later
mv /usr/share/serviio /usr/share/serviio.old
mv serviio-0.6 /usr/share/serviio
[/cc]

  6. Restart Serviio and it should work (assuming you have the Ubuntu init script installed):

[cc lang="bash" escaped="true" width="100%"]
/etc/init.d/serviio restart
[/cc]
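
To make sure the restart actually took, here is a quick generic sanity check (nothing Serviio-specific, just looking for the Java process and its listening sockets):

[cc lang="bash" escaped="true" width="100%"]
## the [s] trick stops grep from matching its own process
ps aux | grep [s]erviio
netstat -lntp | grep java
[/cc]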

Once you are done with the restart, and assuming it works, you will need to open the Serviio console and reconfigure your settings again. I prefer it this way, as it makes sure the configs are fresh and do not carry over any incompatible stuff from the previous version.


HP announces upgraded ProLiant MicroServer models powered by AMD Turion N40L


Good news for those interested in joining the HP MicroServer gang: HP is launching updated versions of this popular server, with basically upgraded hardware inside. The outer look stays the same, which is fine with me, as I am very satisfied with the current design.

The CPU has been upgraded from the original 1.3GHz dual-core AMD Athlon II NEO N36L to the newer and better-performing 1.5GHz dual-core AMD Turion II NEO N40L.

Model              | Processor                                | Memory       | Hard Disk Drive | Others
Old HP MicroServer | 1.3GHz dual-core AMD Athlon II NEO N36L  | 1GB ECC DDR3 | 1 x 160GB       | -
New Entry Level    | 1.5GHz dual-core AMD Turion II NEO N40L  | 2GB ECC DDR3 | 1 x 250GB       | -
New Higher Level   | 1.5GHz dual-core AMD Turion II NEO N40L  | 4GB ECC DDR3 | 2 x 500GB       | SBS 2011 Essentials

Reference: http://h10010.www1.hp.com/wwpc/uk/en/sm/WF06b/15351-15351-4237916-4237917-4237917-4248009-5163346.html


Bash scripting: check process running or not and kill it after certain time

I have been doing some bash scripting lately. I am a total noob when it comes to scripting, due to the fact that I kinda hate programming. Still, anyone who has done Linux bash scripting knows it is very powerful; if you know awk, it becomes even more “terror”. Anyway, here is a problem that I needed to solve lately:

I have a script that runs lots of rsync commands, and I need to be able to kill an rsync command if it takes too long to run. Most of the time this is because the target server has a slow connection.

To kill the rsync command, I need to know the process ID (PID) it is running as. In bash, you can find out the PID of the currently running script by using $$, as in the script below:

test1.sh
#!/bin/bash
# $$ holds the PID of the current script
PID=$$
echo $PID

root:# ./test1.sh
21334

There is one problem with the script above (at least for my application): it is actually NOT what I need. Like I said, the script runs an rsync command. $$ only gives the PID of the current script, but I need the PID of the rsync command. Look at the example below:

test2.sh
#!/bin/bash
rsync --dry-run /home /tmp
# $$ is still the PID of this script, not of the rsync above
PID=$$
echo $PID

root:# ./test2.sh
21355

The output above actually shows the PID of the test2.sh script; however, I need the PID of this script’s child, which is the rsync command. I have searched around, and so far the only way I have found is $!. $! holds the PID of the process most recently sent to the background, therefore we need to send the rsync command into the background.

test3.sh
#!/bin/bash
PID=$$
# send rsync to the background so that $! can capture its PID
rsync -a --dry-run /home /tmp &
PID2=$!
echo $PID
echo $PID2

root:# ./test3.sh
21366
21367

There! You can see $PID2 now shows the PID of the rsync process. If your /home folder has lots of files, the script will take some time to run; even though you get your prompt back, the rsync is still running in the background. In that case, you can open another SSH session and use “ps aux | grep rsync” to see the actual PID, which should tally with the value of $PID2.

Now let’s make things more complicated.

Why do I need this $PID2? So that I can kill the process if it is taking too long to run. I do not want one process hogging the line while another rsync waits next in line to run.

Here is the last part of the script with everything thrown in:

test4.sh
#!/bin/bash
rsync -a --dry-run /home /tmp &
PID2=$!                           # PID of the backgrounded rsync
count=0
waittime=30                       # maximum seconds to wait
# kill -0 sends no signal; it only checks that the process still exists
while kill -0 $PID2 2> /dev/null
do
    sleep 1
    ((count++))
    if [ $count -gt $waittime ] ; then
        kill -TERM $PID2 2> /dev/null
        break
    fi
done
# reap the background job and collect its exit status
wait ${PID2}
rcode=$?
echo "Return code is $rcode"

OK, this is the complete code. Run it and see what happens; play with the settings and see what changes. Again, the rsync is sent to the background, while a counter counts and waits for the process to complete, hence the while loop. Inside the loop, a sleep command waits 1 second on each iteration. The loop exits when either the process completes or the waittime expires; in our example, the process is killed after 30 seconds of waiting. Once the loop is done, notice the command “wait ${PID2}”. This is needed to get the return code of that process; in my script, this return code notifies me if the rsync did not complete successfully.

I like this script because it is smart and makes good use of the limitations of bash scripting: it sends the process to the background purely to grab its PID via $!. There are other ways to get this, but so far I like this one, as it is simple to understand and involves fewer extra commands like awk and so on.
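
As an aside, newer coreutils (7.0 and up) ship a timeout command that gives the same kill-after-N-seconds effect in one line. You lose the $PID2 plumbing, but for simple cases it is hard to beat (a sketch; exit status 124 means the command was killed):

# kill the rsync if it runs for more than 30 seconds
timeout 30 rsync -a --dry-run /home /tmp
echo "Return code is $?"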

Reference: http://www.unix.com/shell-programming-scripting/20412-check-if-job-still-alive-killing-after-certain-walltime.html



Linux 64bit or 32bit?

Linux 64bit has been around for a very long time; it came out long before Microsoft released the 64bit versions of Windows. Back then, AMD was the leader in 64-bit processors, so the architecture was called AMD64, and the name has stuck: some Linux distros still use it in the names of their 64bit packages. So don’t be puzzled that you will be installing package_amd64.deb on your Debian or Ubuntu.

It is a no-brainer for me by now: the Linux 64bit platform is well established, very stable, and almost all packages come in 64bit builds. Being 64bit means your software gets double-width registers and memory addressing, and potentially more processing power; in real work, however, performance is not that much faster. The main point of the 64bit platform is to break the 4GB RAM limitation of 32bit.

I would go for 64bit for almost all my OS installations nowadays, be it Linux or Windows. However, something happened the other day that made me re-think my perception of 64bit.

Recently, a server that I manage needed to be upgraded. We bought a new server with better hardware specs, but it stayed at 4GB RAM; at the time, we did not think there was a need for more, and server RAM is not cheap either. As usual, I went for Ubuntu 10.04 LTS 64bit, as this is currently our choice of Linux for our servers. After a few days spent painstakingly installing, configuring and migrating the data from the old server, the new server went online. However, the following day we encountered some problems. The server was slow, and at one point it even crashed. We had to reboot it, and as usual, we checked the logs for any signs of trouble. Eventually, we found that APC (Alternative PHP Cache), which runs on the server, kept churning out errors about not having enough memory, so we knew this had something to do with memory. We tinkered with APC, and the server became more stable, with the load now running pretty low, at less than 2.0.
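
The “tinkering” was mostly a matter of giving APC a bigger shared memory segment, something along these lines (a sketch only, as the exact values we settled on are long forgotten; the ini path and whether the “M” suffix is accepted vary with your distro and APC version):

[cc lang="bash" escaped="true" width="100%"]
## default apc.shm_size is small (around 30M); the 64bit build blew through it
## older APC builds want a bare number of MB, e.g. apc.shm_size=128
echo "apc.shm_size=128M" >> /etc/php5/conf.d/apc.ini
/etc/init.d/apache2 reload
[/cc]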

However, there were still some problems, and we were not sure whether they came from APC or other components like Apache2 or PHP. Even though the configurations were all the same as on the old server, the software is running at newer versions. Another few days were spent checking and tinkering until we found that the “blank page” problem was due to APC. However, disabling APC does not solve the problem permanently, as it causes the server to run at a higher load and use more memory.

After even more checking, we found that each PHP session was using double the amount of memory compared with the previous server. This eventually led to the culprit… 64bit. Apparently, when you go 64bit, memory usage almost doubles for certain applications, since pointers and many internal structures become twice as wide. In our case, PHP was using double the memory to run the usual stuff, but to give 64bit some credit, it does run a bit faster than before. Maybe it is due to the wider 64bit registers, or maybe it is due to the better processor.

In any case, this taught me a lesson on 32bit vs 64bit.

  1. Use 32bit – if you do not plan to upgrade your system RAM to more than 4GB.**
  2. Use 64bit – if you will definitely be using more than 4GB of memory, now or after a later upgrade, and want to take advantage of 64bit processors and hardware.

**Bear in mind that it is possible to run 32bit Linux with more than 4GB RAM: a 32bit Linux kernel can still make use of all the memory through something called PAE (Physical Address Extension). However, this method does have some limitations; for example, each individual process is still confined to a 32bit address space.
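
If you are unsure which way a particular box can go, the CPU flags in /proc/cpuinfo will tell you: “lm” (long mode) means the CPU can run a 64bit OS, while “pae” means a 32bit kernel can use PAE to see past 4GB. A quick check on any Linux box:

[cc lang="bash" escaped="true" width="100%"]
## lm = 64bit capable, pae = Physical Address Extension supported
egrep -wo 'lm|pae' /proc/cpuinfo | sort -u
[/cc]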


