1. MindCodec Retweeted

    We are currently researching new methods for automatically constructing variational distributions (beyond the standard mean-field approach). If you know of literature on this topic, please let us know! (And retweet if you have followers who might!)
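
For context, the "standard mean-field approach" mentioned above factorizes the variational posterior over latent dimensions, q(z) = prod_i q_i(z_i), so no correlations between latents can be represented. A minimal NumPy sketch of a mean-field diagonal-Gaussian family (all names are illustrative, not Brancher's API):

```python
import numpy as np

# Mean-field (fully factorized) Gaussian variational family:
# q(z) = prod_i N(z_i | mu_i, sigma_i^2), i.e. a diagonal covariance,
# so correlations between latent dimensions cannot be captured.
rng = np.random.default_rng(0)
dim = 3
mu = np.zeros(dim)         # variational means (learnable in practice)
log_sigma = np.zeros(dim)  # log standard deviations (learnable in practice)

def rsample(n):
    # Reparameterized samples: z = mu + sigma * eps with eps ~ N(0, I),
    # which keeps samples differentiable w.r.t. mu and log_sigma.
    eps = rng.standard_normal((n, dim))
    return mu + np.exp(log_sigma) * eps

z = rsample(1000)
```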

  2. MindCodec Retweeted

    Is there a comprehensive review on the effect of SGD batch size on generalization?

    I've seen papers arguing that large-batch GD does not generalise, and also papers that argue the opposite or say the generalisation gap is not that bad for large-batch SGD. So which one is it?

  3. MindCodec Retweeted

    A new @artcogsys preprint! We present DeepRF: ultrafast population receptive field (pRF) mapping.
    With similar performance in a fraction of the time, it enables the fitting of more complex pRF models, resolving an important limitation of the conventional method.

  4. MindCodec Retweeted

    Thrilled to announce that I just accepted an offer from the @DondersInst @AI_Radboud! I will join their faculty as Assistant Professor this fall.

    Get in touch if you would like to work with me in the area of (visual) computational neuroscience and machine learning!

  5. MindCodec Retweeted

    Check out TaylorAlgebra: a small package for symbolic Taylor expansions. We are soon going to support and use it for perturbative machine learning in Brancher.
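
TaylorAlgebra itself isn't shown here, but the underlying idea, doing arithmetic directly on truncated Taylor coefficients, can be sketched in a few lines of plain Python (a toy illustration, not the package's API):

```python
import math

# Toy truncated Taylor (Maclaurin) series arithmetic: a "series" is a
# list c of coefficients with f(x) ~ sum_k c[k] * x**k, cut at ORDER terms.
ORDER = 6

def series_mul(a, b):
    # Cauchy product of two series, truncated at ORDER terms.
    c = [0.0] * ORDER
    for i in range(ORDER):
        for j in range(ORDER - i):
            c[i + j] += a[i] * b[j]
    return c

# The coefficients of e^x are 1/k!; the product e^x * e^x must give
# e^(2x), whose coefficients are 2^k / k! -- a quick consistency check.
exp_series = [1.0 / math.factorial(k) for k in range(ORDER)]
exp2_series = series_mul(exp_series, exp_series)
```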

  6. MindCodec Retweeted

    We are happy to share Brancher 0.35 with dedicated support for stochastic processes, infinite models, and time series analysis. Try it out in our new tutorial!

    If you like our work, please share it and star us on GitHub! :)

  7. MindCodec Retweeted

    We are happy to share our latest work on generative modeling. k-GANs is a generalization of the well-known k-means algorithm in which every cluster is represented by a generative adversarial network. The mode collapse phenomenon is greatly reduced! 
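
For reference, here is the plain k-means loop being generalized, as a minimal NumPy sketch (deterministic farthest-first initialization chosen for reproducibility; this is standard k-means, not the k-GANs implementation, which replaces each centroid with a GAN and the distance with a model-based score):

```python
import numpy as np

def kmeans(X, k, iters=20):
    X = np.asarray(X, dtype=float)
    # Deterministic farthest-first initialization: start from the first
    # point, then repeatedly add the point farthest from current centers.
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.stack(centers)
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # Update step: each center becomes the mean of its assigned points.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return centers, labels
```

In the k-GANs analogy, the assignment step scores each point under each cluster's generative model, and the update step retrains each GAN on its assigned points.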

  8. MindCodec Retweeted

    Coming soon in Brancher: 1) a dedicated interface for stochastic processes and time series analysis; 2) automatic construction of variational distributions.

  9. MindCodec Retweeted

    Check out our new preprint: a large fMRI dataset with ~23 hours of video stimuli (Doctor Who)!

  10. MindCodec Retweeted

    🗞 ICYMI (weekly newsletter)

    🛠 New Tools

    1. @OttomaticaLLC slim
    2. @pybrancher
    3. Entropic
    4. NeuronBlocks by @msdev
    5. CJSS by @xsanda

    Subscribe 👉 

  11. MindCodec Retweeted

    Check out our new probabilistic deep learning tutorial: 

    You will learn how to build deep probabilistic models using Brancher and how to integrate existing networks in your Brancher models.

  12. MindCodec Retweeted

    this effect is amazingly strong and continues to intrigue...

  13. MindCodec Retweeted

    Happy to announce that the paper "Computational Resource Demands of a Predictive Bayesian Brain" by @IrisVanRooij and @JohanKwisthout is online now in @CompBrainBeh and freely accessible:  1/7

  14. MindCodec Retweeted

    Check out @pybrancher on GitHub: a user-centered package for differentiable probabilistic inference powered by @PyTorch. Example tutorials: time series analysis / autocorrelation, Bayesian inference ...

  15. MindCodec Retweeted

    1/3: Thank you for your interest and support! We are soon going to share a roadmap that will outline our plans for future developments. Among other things, we are going to support symbolic/analytic computations and discrete latent variables.

  16. MindCodec Retweeted

    We are excited to announce the release of Brancher, a module for deep probabilistic inference powered by @PyTorch: 

    Check out our tutorials in @GoogleColab here: 

  17. Our latest paper on stochastic optimization is out! We use Taylor expansions and a pinch of complex analysis to obtain low-variance gradient estimators for models with either discrete or continuous latent variables.
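
The preprint itself isn't reproduced here, but as an illustration of the general flavor (Taylor expansions plus complex analysis for low-noise derivatives), the classical complex-step trick recovers f'(x) from the imaginary part of f(x + ih), avoiding the subtractive cancellation of finite differences. This is a standard textbook technique, not the paper's estimator:

```python
import cmath
import math

def complex_step_derivative(f, x, h=1e-20):
    # Taylor expansion: f(x + ih) = f(x) + ih*f'(x) - (h**2 / 2)*f''(x) + ...
    # so Im(f(x + ih)) / h = f'(x) + O(h**2), with no difference of nearly
    # equal numbers -- h can be made tiny without round-off blow-up.
    return f(complex(x, h)).imag / h

# Example: d/dx exp(sin(x)) = cos(x) * exp(sin(x))
f = lambda z: cmath.exp(cmath.sin(z))
approx = complex_step_derivative(f, 0.7)
exact = math.cos(0.7) * math.exp(math.sin(0.7))
```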

  18. MindCodec Retweeted

    highlight (jetlag special): A Contrastive Divergence for Combining Variational Inference and MCMC 

  19. MindCodec Retweeted

    StyleGAN trained on abstract art!

    Some of the training images are zoomed-in photos/digital art, and others are zoomed-out photos including the picture frame - hence the zooming/matting effect. Also, the data (from Flickr) includes some DeepDream samples, which the model picked up on.

    Embedded video

  20. MindCodec Retweeted

    Perceptual Straightening of Natural Videos

    a short post on a recent paper from Eero Simoncelli's group.