Intel 320 SSD random write performance

While I like the performance provided by PCI-E cards like FusionIO or Virident tachIOn, I am often asked about SATA drive alternatives, as the price of PCI-E cards is often a barrier, especially for startups. There is a wide range of SATA drives on the market, and it is hard to pick one, but Intel SSDs are probably among the most popular, and I got a pair of Intel 320 SSD 160GB drives to play with.

Probably the most interesting characteristic of an SSD for me is random write throughput as a function of file size, as it is known that write throughput declines as more space is used. In this post I test (using sysbench fileio) a single Intel 320 SSD with different file sizes (from 10 to 140 GiB, in steps of 10 GiB). The filesystem is XFS and the IO block size is 16KiB. I posted all scripts and results on our Launchpad project, where you can find them.
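The benchmark loop can be sketched roughly as follows. This is a minimal sketch, not the author's published script (that one is on Launchpad): the device name, mount point, and exact sysbench flags here are assumptions.

```shell
#!/bin/sh
# Sketch of the benchmark loop: for each file size, reformat XFS and run
# a 1-hour 16KiB random write test. DEV and MNT are hypothetical names.
DEV=${DEV:-/dev/sdb1}   # hypothetical SSD partition
MNT=${MNT:-/mnt/ssd}    # hypothetical mount point
RUN=${RUN:-echo}        # dry-run by default; set RUN= (empty) to execute

for SIZE_GB in $(seq 10 10 140); do
    $RUN mkfs.xfs -f "$DEV"            # fresh filesystem before each run
    $RUN mount "$DEV" "$MNT"
    $RUN sysbench --test=fileio --file-total-size=${SIZE_GB}G prepare
    # --report-interval needs sysbench 0.5+; with older versions the
    # 10-second throughput samples must be collected externally.
    $RUN sysbench --test=fileio --file-total-size=${SIZE_GB}G \
        --file-test-mode=rndwr --file-block-size=16384 \
        --max-time=3600 --max-requests=0 --report-interval=10 run
    $RUN sysbench --test=fileio --file-total-size=${SIZE_GB}G cleanup
    $RUN umount "$MNT"
done
```

With the default `RUN=echo` the script only prints the commands, which is handy for checking the loop before pointing it at a real device.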

The methodology was: format XFS, then run a one-hour random write test, measuring throughput every 10 seconds.

The results are a bit tricky to analyze, as throughput behaves like this (for a 100GiB file size): just after the format, throughput starts at 80MiB/sec, then drops to 10MiB/sec, and after about half an hour stabilizes at the 30MiB/sec level.

We can build the same graph (time -> throughput) for all the file sizes, where you can see that throughput drops from 100MiB/sec for a 10GiB file to 15MiB/sec for a 140GiB file. For reference I added the result of a similar benchmark on RAID10 over 8 regular spinning 15K SAS disks, which is around 23MiB/sec.

The graphs show that all results stabilize after 2500 sec, and if we take the slice of data after 2500 sec, the summary graph (size -> throughput) makes it much easier to see the throughput for a given file size. E.g., for a 70GiB file we get 40MiB/sec, and for a 120GiB file it is 20MiB/sec.

Some conclusions from these results:


  • Intel 320 SSD performance is affected by the amount of used space: the more space is used, the worse the performance.

  • Throughput can drop quite sharply as file size grows; e.g., from a 10GiB to a 20GiB file it drops by 20%.

  • When you run a benchmark of your own, take into account the time needed to reach a stabilized result. In some cases it may take over half an hour.
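On the analysis side, taking the steady-state slice (after 2500 sec) and averaging it can be sketched with awk. The two-column results format below is an assumption standing in for the real sysbench output:

```shell
# Hypothetical results format: "time_sec throughput_mibps", one row per
# 10-second report. Example data standing in for one run's output:
printf '2400 80\n2600 30\n2700 32\n' > results.txt

# Keep only the steady-state tail (t > 2500 s) and average it, as done
# for the summary (size -> throughput) graph:
awk '$1 > 2500 { sum += $2; n++ } END { if (n) printf "%.1f\n", sum/n }' results.txt
# prints 31.0
```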

Finally, I want to give credit to the R project and ggplot2, which are very helpful for graphical analysis of data.
