pfh blogs




~ 1% inspiration, 99% perseveration ~


18 October 2016, 5:53 UTC
Shiny interactivity with grid graphics

This post may be of interest to a very small number of people. Using a fancy grid-based graphics library in R such as lattice, and want some Shiny brushing goodness? Sadly, Shiny's plot interaction features only support base graphics and ggplot2.

Solution: first work out the name of the relevant viewport (grid.ls(viewports=TRUE, grobs=FALSE)), then use the gridBase package to set up a corresponding base graphics viewport.

output$plot <- renderPlot({

    # ... draw some grid graphics ...

    # Move to the viewport that brush coordinates should map to,
    # then overlay an invisible base graphics plot with a known
    # coordinate system on top of it.
    seekViewport("...someviewport...")
    par(new=TRUE, plt=gridBase::gridPLT())
    plot(1, type="n", axes=FALSE,
         xlab="", ylab="",
         xlim=c(0,1), ylim=c(0,1),
         xaxs="i", yaxs="i")
})

If the viewport isn't named, tell the developer of the package to name their viewports.
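For completeness, here is a sketch of how the brushed region can then be read back, in the 0..1 coordinate system set up above. The IDs "plot" and "plot_brush" are my own choices for this example:

```r
# In the UI: request brushing on the plot.
plotOutput("plot", brush = brushOpts(id = "plot_brush"))

# In the server: brush coordinates arrive in the coordinate
# system of the invisible base graphics plot.
observe({
    b <- input$plot_brush
    if (!is.null(b))
        cat("brushed:", b$xmin, b$xmax, b$ymin, b$ymax, "\n")
})
```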

[permalink]


7 August 2016, 2:17 UTC
Crash course in R a la 2016 with a biological flavour

Monash Bioinformatics Platform gave this day-long course on Friday 5 August 2016 (following an introductory course on the Thursday). The course style is inspired by Software Carpentry, but currently sacrifices some interaction in order to get where I wanted to go: someone able to do useful bioinformatics work at a grad-student level.

We definitely need to work on getting more interaction going; everyone was too quiet. This may have just been an effect of packing in so much material. However, plenty of solutions to challenges came in via the chat system, so I'm fairly sure most people were more or less following. The anonymity of chat introduces an interesting dynamic: students can propose solutions without worrying about defending their ego or propriety, or about stereotype threat.

[permalink]


17 July 2016, 1:45 UTC
Vectors are enough
Numpy goes to some lengths to support multi-dimensional arrays, "tensors". This isn't necessary: vectors are enough.

R has a different solution. A data frame (similar to a table in a database), with a column vector for each dimension and one for the value, stores the same information, and a data frame can store many other things besides. Sparse tensors are easy, for example.
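To make this concrete, here is a small 2x3 "tensor" stored as a data frame, dense and then sparse (the column names are my own choice for this example):

```r
# Dense: one row per cell, dimension columns plus a value column.
dense <- expand.grid(i = 1:2, j = 1:3)
dense$value <- c(0, 0, 5, 0, 0, 7)

# Sparse: simply drop the rows holding the default value.
sparse <- dense[dense$value != 0, ]

# Look up a cell; an absent row means the default value.
sparse$value[sparse$i == 1 & sparse$j == 2]   # 5
```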

As Judea Pearl might say, statistics is not about causality. A tensor implies causality: the dimensions of the tensor cause the values it holds. R, being a statistical language, rightly does not require causality to be specified, and gains flexibility. Or we might say that the data frame approach does not require the primary key columns to be made explicit. So for example a bijective map need not privilege one direction of mapping (as a Python dict would).

A data frame dense in some of its columns (eg from a full factorial experiment) can be stored more compactly as a tensor, but this is a detail of implementation that could be hidden (eg in a new flavour of tibble). Similarly, one might want various indexes to be maintained alongside a data frame.

[permalink]


28 May 2016, 22:04 UTC
Sci-Hub over JSTOR

I use Sci-Hub as a source for academic literature.

Sci-Hub has the massive advantage over JSTOR that they don't claim to own the work of millions of academics who worked for the general betterment of humanity, and then hound people who challenge this to suicide. For me this is a compelling advantage.

A basic process of science, publication of results, has become an exercise in restricting access to knowledge for profit. Sci-Hub is the only way most people have of accessing much of the academic literature without encountering endless punitive tolls.

[permalink]


4 November 2015, 23:41 UTC
Composable Shiny apps

Update April 2016: RStudio now has an official way to write composable shiny "modules".



Shiny is a library for writing web applications in R. If you use R, I highly recommend it: it's very clever. If you haven't used R and Shiny, this blog post is going to be pretty incomprehensible.

I wanted a way to compose Shiny apps. I haven't found an official way to do this, so this post details my solution. Maybe some day the clever people at RStudio will release an official way that makes this all moot.


You may want to review the tutorial on the various ways a Shiny app may be constructed and run. I will be constructing Shiny apps using the shinyApp function. This can either be done interactively, or from a file "app.R" in a Shiny app directory. Working interactively, an app is launched when it is print()ed, similar to ggplot. So, working interactively, calling a function that creates a Shiny app will launch it immediately (unless you save the result to a variable). Only in R!

You may also want to review how reactive expressions in Shiny work. Recall that it is the job of the server function in a Shiny app to create all the reactive expressions.



My code can be found in my Varistran package, in particular the file shiny_util.R.

What I call a "composable Shiny app" is a function in R returning a Shiny app object (so it can be used in the normal ways), but with that app object having some extra fields, $component_ui containing the UI part of the app and $component_server containing the server function.

varistran::shiny_plot is an example of a composable Shiny app. Such a function constructs ui and server and then calls varistran::composable_shiny_app(ui,server) to create the app.
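For orientation, here is a simplified sketch of the shape such a function might take. This is my illustration, not Varistran's actual code (prefix handling and other details are omitted):

```r
library(shiny)

# Sketch of the idea: a normal Shiny app object, with its UI and
# env-taking server function attached for later composition.
composable_shiny_app_sketch <- function(ui, server) {
    app <- shinyApp(
        ui = fluidPage(ui),
        server = function(input, output, session) {
            # Wrap the env-taking server in a standard Shiny
            # server; environment() here binds input, output,
            # and session, so server() can use env$input etc.
            server(environment())
        })
    app$component_ui <- ui
    app$component_server <- server
    app
}
```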

Considerations when writing a composable Shiny app:

Example

This is all a bit confusing. Reading over the varistran::shiny_plot function should give you a better idea of how it all fits together. You will note in this function:



Example usage:

install.packages(c("devtools", "shiny"))
devtools::install_github("MonashBioinformaticsPlatform/varistran")

library(shiny)

plot_app <- varistran::shiny_plot(
    # The plot callback is passed an environment through which it
    # can access the enclosing app's inputs.
    function(env) {
        plot((1:10)**env$input$power)
    },
    prefix="myplot"
)

ui <- fluidPage(
    titlePanel("Composable app demo"),
    numericInput("power", "Power", 2),
    plot_app$component_ui
)

# The enclosing server simply delegates to the component's server.
server <- function(env) {
    plot_app$component_server(env)
}

varistran::composable_shiny_app(ui, server)

Here is the result.

The plot component is able to react to changes from outside, namely the value of input$power.

[permalink]


14 September 2015, 4:39 UTC
Recorder technique and divisions in the 16th century

I gave a talk on the book Fontegara (1535) by Sylvestro Ganassi at this year's St. Vitas Dance & Music weekend. This book is a manual on recorder technique and making divisions (improvisation on a ground). Here are my class notes:

The actual book and its translation are available online from IMSLP, and are quite readable if you are into that sort of thing.

[permalink]


30 July 2015, 0:04 UTC
Linear models, a practical introduction in R

These are slides from a talk I gave introducing linear models and nested-model hypothesis testing in R. These are the basis of many common statistical tests.
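As a taste of the topic, nested models can be compared in R with anova(). The data here is simulated purely for the example:

```r
# Simulated data: does x predict y?
set.seed(1)
x <- 1:20
y <- 2*x + rnorm(20, sd=5)

fit0 <- lm(y ~ 1)    # null model: intercept only
fit1 <- lm(y ~ x)    # alternative: a linear effect of x

# F test comparing the two nested models.
anova(fit0, fit1)
```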

[permalink]


7 May 2015, 5:47 UTC
When I was a young lad in the '90s

When I was a young lad in the '90s, Bill Gates was a bad guy. Turns out he just hadn't gotten to the "and give to the poor" bit. Now Google and Apple are battling for OS dominance and the cloud runs Linux.

When I was a young lad in the '90s, geeks and nerds were oppressed and didn't form roving online gangs.

When I was a young lad in the '90s, free software was going to change the world. That kinda half worked.

When I was a young lad in the '90s, the internet was going to facilitate new systems for creating intellectual property without resorting to copyright. Whuffie would be the new unit of currency. Kickstarter is pretty neat. Reddit scares me a little. Anita Sarkeesian had a pretty successful kickstarter, and then so did her opponents.

When I was a young lad in the '90s, blogging was going to change everything. We streamlined it and got facebook and tumblr and memes. The past is now unimaginable.

When I was a young lad in the '90s, the internet offered wonderful new ways of making money with very little startup cost, offering the chance to spread wealth down the long tail. Now agile startups trash whole industries overnight. Etsy has some neat stuff on it though.

When I was a young lad in the '90s, GUIs were making computers more than just a niche interest. Now everyone's learning to code.

Victory, terrible victory, and I'm so lost.

[permalink]


3 November 2014, 11:41 UTC
Virtualenv Python+R

The following creates a Python virtualenv that can also hold R libraries.

    # Create the virtualenv and a directory to hold R libraries.
    virtualenv venv
    mkdir venv/R

    # Point R at the library whenever the virtualenv is active.
    echo 'export R_LIBS=$VIRTUAL_ENV/R' >>venv/bin/activate

    # Point R at the library when Python itself is the entry point.
    for LIB in venv/lib/python*
    do
        echo 'import os,sys; os.environ["R_LIBS"]=sys.prefix+"/R"' >$LIB/sitecustomize.py
    done

    # Wrapper scripts so R and Rscript always run inside the virtualenv.
    # (Quoting $@ preserves arguments containing spaces.)
    echo ". `pwd`/venv/bin/activate && exec `which R` \"\$@\"" >venv/bin/R
    echo ". `pwd`/venv/bin/activate && exec `which Rscript` \"\$@\"" >venv/bin/Rscript
    chmod a+x venv/bin/R venv/bin/Rscript

Python accessed from R using the R package "rPython" will correctly use Python packages installed in the virtualenv. (Note: as of 2014-12-05, rPython is pretty dodgy: floats are only sent accurate to 5 digits by default, and strings are not correctly quoted. You may also need to use jsonlite, calling jsonlite::toJSON with digits=50.)

R accessed from Python with "Rpy2" etc will correctly use R packages installed in the virtualenv.
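A quick sanity check from inside the virtualenv (run via the venv/bin/R wrapper created above):

```r
# The virtualenv's R library should come first in the search path.
Sys.getenv("R_LIBS")   # the venv/R directory, if the wrapper is working
.libPaths()
```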


See also Packrat for R.

[permalink]


24 October 2014, 22:08 UTC
Sexism: spreading from computer science to biology
Working occasionally with two junior researchers, one female, one male. Wonder if I'm being fair. Both wet/dry, both learning a bit of R+, excellent combo to be developing skills in. Woman is shy, hesitant, but putting in quite a lot of work. Man is confident despite a bad honours year, and taller than me, solidly built. Shows me his code, impressive for someone new to programming, loops and functions and so on. Maybe he has some latent skills from growing up with computers, and I'm seeing his current level rather than rate of growth? Is there some problem in my own manner?

Wednesday I spent time in a meeting pulling apart the woman's statistics: p=0.14, not correcting for multiple testing, why are you pursuing this? She's trying to make the best of poor data, with considerable stubbornness, in the sort of deathmarch project that backs up one under-powered experiment with another under-powered experiment, working with people who think that if you have a time series you don't need replicates, and who expect a meaningful comparison between in-vivo brain tissue and a brain-cancer cell line.

Ugh.

[permalink]



21 August 2014, 6:03 UTC First-past-the-post voting outcomes tend to surprise the candidates
21 August 2014, 2:59 UTC Dates in Google Search aren't trustworthy
27 June 2014, 2:22 UTC Reading "Practical Foundations of Mathematics"
18 May 2014, 10:34 UTC Cellular automaton tiles revisited
7 April 2014, 8:27 UTC Selfish sweep
3 April 2014, 2:40 UTC Bagpipes kickstarter
1 March 2014, 5:36 UTC Tabor pipes on thingiverse
14 February 2014, 7:19 UTC Demakein: introducing --tweak-emission
12 January 2014, 21:56 UTC Angry White Men
15 December 2013, 10:09 UTC Breathing
11 November 2013, 8:53 UTC Reductionism meets Buddhism
9 November 2013, 1:36 UTC Demakein 0.12: more example scripts
25 October 2013, 1:01 UTC We haven't won, we just got enough power to censor them
28 July 2013, 5:45 UTC Mental clock games
11 June 2013, 9:46 UTC Humanity has declined
1 June 2013, 1:40 UTC Programmer nature
16 April 2013, 9:55 UTC Acetone vapour [detonation] chamber for ABS plastic smoothing
10 April 2013, 8:06 UTC Digital devices for the punk
5 March 2013, 1:39 UTC Alternative architecture: Giant roundhouse
15 February 2013, 13:15 UTC Rockstar job market

All older entries




[atom feed]  
[æ]