[Tune] Pass buffer_length as a function argument

Currently, we can only set the buffer length of training results via an environment variable, which is usually set at the top of a script.

I think it would be better to also provide a way to set the buffer length through a function argument (e.g., add an argument to tune.run).

For deep learning models, we usually train a model for several steps and then perform validation, so I set the buffer length to the validation period. However, that period is often read from command-line parameters and can vary across applications. It's a bit inflexible to only be able to set it via an environment variable.
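To illustrate, here is a minimal sketch of the current workaround versus the proposed argument. It assumes `TUNE_RESULT_BUFFER_LENGTH` is the environment variable Tune reads for this setting, and the `buffer_length` keyword shown at the end is the hypothetical API being requested, not an existing one:

```python
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--val-period", type=int, default=10,
                    help="number of training steps between validations")
# Empty list here just so the sketch runs standalone; drop it to read real CLI args.
args = parser.parse_args([])

# Current workaround: forward the command-line value into the env var
# before Tune reads it. (Env var name assumed.)
os.environ["TUNE_RESULT_BUFFER_LENGTH"] = str(args.val_period)

# Proposed API (does not exist yet): pass the value directly, e.g.
# tune.run(trainable, buffer_length=args.val_period)
```

The env-var route works, but it couples the script's CLI to process-global state, which is what the proposed argument would avoid.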

Fair! Could you file an issue on GitHub so we can track that? (Or even feel free to post a pull-request draft!)