Regression Analysis for profiling L-Store metadata throughput handling



Objective

As part of the initial testing of L-Store, these tests attempt to establish a baseline reference model for metadata handling in L-Store. Specifically, we are interested in whether the latency profile is better described by a linear or a non-linear fit as file sizes increase from very small to large. Under a simple cost model (a fixed per-file overhead plus a transfer cost proportional to file size) the profile would be linear; metadata overhead that grows with file size would show up as a non-linear profile.

Parameters

1. File Size: 1 KB, 500 KB, 1 MB, 50 MB, 100 MB, 150 MB, 200 MB, 250 MB

2. Number of Files: 30 files

3. Number of Threads: 10 threads (a sketch of the timing harness follows this list)
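
Below is a minimal sketch of how one such run could be driven. It assumes a hypothetical upload_file() wrapper around the actual L-Store upload client (not part of any documented L-Store interface); it simply times each of the 30 uploads of a given size across a pool of 10 threads and summarizes the latencies.

    # Sketch of a profiling driver for a single file-size point.
    # upload_file() is a hypothetical placeholder; substitute the real
    # L-Store client invocation used in these tests.
    import time
    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean, median, stdev

    NUM_FILES = 30
    NUM_THREADS = 10

    def upload_file(path):
        raise NotImplementedError("call the real L-Store upload client here")

    def timed_upload(path):
        # Wall-clock latency of one upload, in seconds.
        start = time.time()
        upload_file(path)
        return time.time() - start

    def profile_upload(paths):
        # paths: the 30 test files of one target size.
        assert len(paths) == NUM_FILES
        with ThreadPoolExecutor(max_workers=NUM_THREADS) as pool:
            latencies = list(pool.map(timed_upload, paths))
        return {
            "average": mean(latencies),
            "median": median(latencies),
            "maximum": max(latencies),
            "stdev": stdev(latencies),
        }

Each row of the results table below corresponds to one such run over 30 files of the stated size.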

Results

  • Number of Threads: 10
  • Block Size: 1 MB (each slice is 1 MB)
  • Number of Files: 30
  • Current Status: In progress
  • Time of Completion:


Type of Test   | Number of Files | Average File Size (MB) | Average Latency (sec) | Median Latency (sec) | Maximum Latency (sec) | Std. Deviation of Latency (sec)
profile_upload | 30              | 0.001                  | 2.8                   | 2.7                  | 11.0                  | 0.63
profile_upload | 30              | 0.5                    | 3.0                   | 3.0                  | 10.0                  | 0.45
profile_upload | 30              | 1.0                    | 3.5                   | 3.4                  | 4.5                   | 0.22
profile_upload | 30              | 50.0                   | 8.4                   | 8.3                  | 10.0                  | 0.49
profile_upload | 30              | 100.0                  | 13.0                  | 13.0                 | 16.0                  | 0.8
profile_upload | 30              | 150.0                  | 18.0                  | 18.0                 | 19.0                  | 0.87
profile_upload | 30              | 200.0                  | 23.0                  | 23.0                 | 27.0                  | 1.3
profile_upload | 30              | 250.0                  | 27.0                  | 27.0                 | 47.0                  | 1.2
profile_upload | 30              | 500.0                  | 54.0                  | 53.0                 | 78.0                  | 4.8
[[Image:Latency.png]]
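
As a quick check of the linearity question, the sketch below (assuming NumPy is available) fits a least-squares line to the average latency versus average file size values from the table above. Since each slice is 1 MB, the number of slices, and hence per-slice metadata operations, grows in direct proportion to file size, so a fixed-cost-per-slice model would predict exactly such a linear profile. On the numbers as reported, the fit comes out to roughly 0.1 s per MB (about 10 MB/s of effective throughput) on top of roughly 3 s of fixed per-file overhead.

    # Least-squares linear fit of average upload latency vs. average file size,
    # using the values reported in the table above.
    import numpy as np

    size_mb = np.array([0.001, 0.5, 1.0, 50.0, 100.0, 150.0, 200.0, 250.0, 500.0])
    avg_sec = np.array([2.8,   3.0, 3.5, 8.4,  13.0,  18.0,  23.0,  27.0,  54.0])

    slope, intercept = np.polyfit(size_mb, avg_sec, 1)   # roughly 0.10 s/MB and 2.9 s
    residuals = avg_sec - (slope * size_mb + intercept)

    print(f"slope     = {slope:.3f} s/MB (~{1.0 / slope:.1f} MB/s effective throughput)")
    print(f"intercept = {intercept:.2f} s (fixed per-file overhead)")
    print(f"max |residual| = {np.abs(residuals).max():.2f} s")

A non-linear alternative can be tested the same way, for example by fitting a quadratic with np.polyfit(size_mb, avg_sec, 2) and comparing the residuals against the linear fit.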