Category(s): Non-Fiction, Business & Economics, Finance
Tag(s): "quantitative finance", "financial engineering", "mathematical finance", "quant", "quantitative trading"
Please read updates at http://www.mathfinance.cn
Quantitative Finance Collector is simply a record of my financial engineering learning journey as a master's student in quantitative finance, a PhD candidate in finance, and a quantitative researcher. It is mainly about quantitative finance code and methods in mathematical finance, focusing on derivative pricing, quantitative trading and quantitative risk management, with most of the entries written at university.
Quantitative Finance Collector
Updated: 8-10
Handling Large CSV Files in R
A follow-up to my previous post, Excellent Free CSV Splitter. I asked a question on LinkedIn about how to handle large CSV files in R / Matlab. Specifically,
Quotation: suppose I have a large CSV file with over 30 million rows; both Matlab and R run out of memory when importing the data. Could you share your way of handling this issue? What I am thinking of is:
a) split the file into several pieces (free and straightforward, but hard to maintain);
b) use MS SQL/MySQL (I would have to learn it; MS SQL isn't free, and it is not straightforward).
A useful summary of the suggested solutions:
1, 1) import the large file via scan() in R;
2) convert it to a data.frame, to keep the data formats;
3) use cast, to group the data in as "square" a format as possible; this step involves the reshape package, a very good one.
2, use the bigmemory package to load the data, so in my case using read.big.matrix() instead of read.table(). There are several other interesting
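Putting the three steps of the first solution together, here is a minimal R sketch. The file name, column layout, and demo data are my own illustration (standing in for the 30-million-row file), not from the original post:

```r
# Demo file standing in for the large CSV; in practice you would point
# scan() at the real file path.
path <- tempfile(fileext = ".csv")
writeLines(c("id,variable,value",
             "1,price,10.5",
             "1,volume,300",
             "2,price,11.2",
             "2,volume,250"), path)

# 1) scan() with column types declared up front is lighter than read.csv(),
#    which has to guess each column's type.
raw <- scan(path,
            what = list(id = integer(), variable = character(), value = numeric()),
            sep = ",", skip = 1, quiet = TRUE)

# 2) convert to a data.frame to keep the column formats
df <- as.data.frame(raw, stringsAsFactors = FALSE)

# 3) cast the long data into a "square" (wide) layout: one row per id,
#    one column per variable. Guarded since the reshape package may not
#    be installed.
if (requireNamespace("reshape", quietly = TRUE)) {
  wide <- reshape::cast(df, id ~ variable)
}
```

Declaring the types in `what` is what saves memory here: scan() allocates each column once instead of building intermediate character vectors.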
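And a minimal sketch of the bigmemory route, again with an illustrative demo file (bigmemory stores a single numeric type, so it suits all-numeric data; the optional backing-file arguments are omitted here):

```r
# Demo numeric CSV standing in for the large file
path <- tempfile(fileext = ".csv")
write.csv(data.frame(a = 1:5, b = (1:5) / 2), path, row.names = FALSE)

# read.big.matrix() memory-maps the data rather than copying it all into
# RAM, so the R session holds only a small pointer object. Guarded since
# the bigmemory package may not be installed.
if (requireNamespace("bigmemory", quietly = TRUE)) {
  x <- bigmemory::read.big.matrix(path, header = TRUE, type = "double")
  x[1:2, ]  # only the rows you index are pulled into memory
}
```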