Update: As one test, I went to the linked web page and cut and pasted the 100 Essential lines of R into my R GUI, and presto, I had a fully commanded SQL database engine running under R control. So let me once again congratulate these two code marvels, R and SQLite.
I keep complex queries in my personal R module, as constants and as parts of R functions. So I have SQL strings, and my base R/SQL, the 100 Essentials, includes an ultra-lite ten-line token parser for parameter substitution and execution. R moves strings in and out of SQL using small to medium sized R modules. I can read the SQLite master into an array of strings, then reuse that array in SQL queries as table names. I can read in a vector of text codes, decompose them in R with string manipulation, then post secondary queries using the substrings.
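The query helper used throughout this post is qraw, which takes a SQL string and a database file name and returns the result. The post never shows its body, so here is a minimal sketch of what such a helper might look like, assuming it is built on the RSQLite package (an assumption; the author's version may differ):

```r
library(RSQLite)

# Sketch of a qraw-style helper (hypothetical body): open the SQLite file,
# run one SQL string, return the result as a data frame, then disconnect.
qraw <- function(sql, db) {
  con <- dbConnect(SQLite(), db)
  on.exit(dbDisconnect(con))
  dbGetQuery(con, sql)
}
```

With a helper shaped like this, qraw("select * from benefit;", "mydata.db") hands back an ordinary data frame you can index with [row, column].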
This snippet cuts out part of the series id and looks up the corresponding text in SQLite.
if (prefix == "CC") {
  # look up the benefit text for characters 5-9 of the series id
  string <- paste("select * from benefit where code = '", substr(s, 5, 9), "';", sep = "")
  result[i] <- qraw(string, db)[1, 2]; i <- i + 1
  # look up the industry/occupation text for characters 10-15
  string <- paste("select * from industryocc where code = '", substr(s, 10, 15), "';", sep = "")
  result[i] <- qraw(string, db)[1, 2]; i <- i + 1
  # look up the benefit text a second time
  string <- paste("select * from benefit where code = '", substr(s, 5, 9), "';", sep = "")
  result[i] <- qraw(string, db)[1, 2]; i <- i + 1
}
I name tables explicitly in that code, but I also keep the tables in array form, obtained with:
t <- qm(db)$V1
Then I can loop on tables, and loop on substring matches.
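That loop-on-tables pattern can be sketched in a self-contained way; here a stand-in vector replaces the names the post reads from qm(db)$V1, and the loop just builds the query strings that qraw would then run:

```r
# Stand-in for the table-name vector the post obtains with t <- qm(db)$V1.
t <- c("benefit", "industryocc")

# Build one query string per table name; qraw(queries[[k]], db)
# would then execute each one against the database.
queries <- sapply(t, function(tbl)
  paste("select count(*) from ", tbl, ";", sep = ""))

queries[["benefit"]]
# "select count(*) from benefit;"
```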
The 100 Essential R/SQL lines allow macro substitution into forms like:
s <- "select * from @ limit @ offset @;"
I set my db locally,
db <- "mydata.db"
then I can:
qraw(varsSql("MyTable 200 400", s), db)
And so on: freehand mixing of R and SQL.
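The post names varsSql but never shows it; since it is described as an ultra-lite ten-line token parser, a minimal sketch might be one that fills each "@" placeholder in the template with the next whitespace-separated token (this body is a guess at its shape, not the author's code):

```r
# Hypothetical varsSql: substitute each "@" in the SQL template with the
# next whitespace-separated token from the argument string.
varsSql <- function(args, template) {
  tokens <- strsplit(args, "[[:space:]]+")[[1]]
  for (tok in tokens)
    template <- sub("@", tok, template, fixed = TRUE)  # replace first "@"
  template
}

varsSql("MyTable 200 400", "select * from @ limit @ offset @;")
# "select * from MyTable limit 200 offset 400;"
```

Passing the result straight to a query helper gives the freehand R/SQL mixing described above.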