<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>MindsEye on Simia Cryptus Software</title>
    <link>http://blog.simiacryptus.com/tags/mindseye/</link>
    <description>Recent content in MindsEye on Simia Cryptus Software</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>en-us</language>
    <lastBuildDate>Sun, 26 Apr 2020 00:00:00 +0000</lastBuildDate>
    
	<atom:link href="http://blog.simiacryptus.com/tags/mindseye/index.xml" rel="self" type="application/rss+xml" />
    
    
    <item>
      <title>DeepArtist.org</title>
      <link>http://blog.simiacryptus.com/projects/deepartist/</link>
      <pubDate>Sun, 26 Apr 2020 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/projects/deepartist/</guid>
      <description>&lt;p&gt;DeepArtist.org is intended to be the umbrella for artistic-themed applications built using MindsEye. This site currently consists of &lt;a href=&#34;http://examples.deepartist.org/&#34;&gt;examples.deepartist.org&lt;/a&gt; which displays example notebooks provided for this project.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/SimiaCryptus/deepartist.org/tree/master&#34;&gt;github://deepartist.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/SimiaCryptus/examples.deepartist.org/tree/master&#34;&gt;github://examples.deepartist.org&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&#34;http://code.simiacrypt.us/release/2.0.0/deepartist/index.html&#34;&gt;Release 2.0.0&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
    </item>
    
    <item>
      <title>MindsEye 2.0</title>
      <link>http://blog.simiacryptus.com/posts/mindseye_2.0/</link>
      <pubDate>Sun, 26 Apr 2020 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/mindseye_2.0/</guid>
      <description>&lt;p&gt;Hello Everybody!&lt;/p&gt;
&lt;p&gt;Today I&amp;rsquo;d like to announce the &lt;a href=&#34;http://code.simiacrypt.us/release/2.0.0/all-projects/&#34;&gt;2.0.0 release&lt;/a&gt; of MindsEye and related projects!&lt;/p&gt;
&lt;p&gt;It&amp;rsquo;s been quite a journey already&amp;hellip;&lt;/p&gt;</description>
    </item>
    
    <item>
      <title>MindsEye 2.0</title>
      <link>http://blog.simiacryptus.com/projects/mindseye/</link>
      <pubDate>Sun, 26 Apr 2020 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/projects/mindseye/</guid>
      <description>&lt;p&gt;MindsEye is an AI framework built in Java. It uses reference counting for efficient resource management and relies on libraries such as CuDNN (CUDA) and Aparapi (OpenCL) for the numerical heavy lifting. It provides a highly customizable optimization library and a wide variety of pre-coded layers.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/SimiaCryptus/all-projects/tree/master/mindseye&#34;&gt;&lt;strong&gt;GitHub URLs&lt;/strong&gt;&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&#34;http://code.simiacrypt.us/release/2.0.0/mindseye/index.html&#34;&gt;Release 2.0.0&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
    </item>
    
    <item>
      <title>DeepArtist.org Release 1.0</title>
      <link>http://blog.simiacryptus.com/posts/deepartistorg_release_10/</link>
      <pubDate>Sun, 01 Sep 2019 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/deepartistorg_release_10/</guid>
      <description>Today I&amp;rsquo;m pleased to announce the release of the Simiacryptus data tools v1.8.0, including the first version of a new image art publishing application named DeepArtist.org and hosted at that domain, notably using the subdomain examples.deepartist.org.
What is it? DeepArtist.org is an image processing platform that uses convolutional neural networks to apply state-of-the-art image processing techniques. This software is targeted at hobbyists and digital artists, so this documentation focuses on the practical tools provided to produce pretty pictures.</description>
    </item>
    
    <item>
      <title>SparkBook - A lightweight data science applications platform</title>
      <link>http://blog.simiacryptus.com/posts/sparkbook__a_lightweight_data_science_applications_platform/</link>
      <pubDate>Wed, 26 Sep 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/sparkbook__a_lightweight_data_science_applications_platform/</guid>
      <description></description>
    </item>
    
    <item>
      <title>Partitioned Style Transfer</title>
      <link>http://blog.simiacryptus.com/posts/partitioned_style_transfer/</link>
      <pubDate>Tue, 05 Jun 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/partitioned_style_transfer/</guid>
      <description>Our newest improvements to style transfer have pushed results even further, producing images that are both more artistically distinctive and more recognizable in content. The first improvement adds a color transformation step that removes color differences before style transfer and re-adds them afterwards using the inverted transformation; without it, the best results required the style and content inputs to have similar color schemes.
The second improvement is much more involved, but essentially involves identifying regions of a given image.</description>
    </item>
    
    <item>
      <title>Easy Deep Learning on AWS with MindsEye</title>
      <link>http://blog.simiacryptus.com/posts/easy_deep_learning_on_aws_with_mindseye/</link>
      <pubDate>Tue, 17 Apr 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/easy_deep_learning_on_aws_with_mindseye/</guid>
      <description>Let’s say you have a local Java application you are developing. For some reason, you want to run some code on AWS EC2; after all, the cloud and virtual computing revolution makes all that theoretically easy. All you need is an AWS account… right?
However, if you are starting from a local Java application and just have the goal “run this code on the cloud”, there are actually quite a few problems to solve before you can do this.</description>
    </item>
    
    <item>
      <title>Texture Generation</title>
      <link>http://blog.simiacryptus.com/posts/texture_generation/</link>
      <pubDate>Mon, 09 Apr 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/texture_generation/</guid>
      <description>One very entertaining application of deep learning is in style modification and pattern enhancement, which has become a popular topic on the internet after Google’s Deep Dream post and subsequent research and publications on style transfer. Reproducing this research has long been a goal for the development of MindsEye, and now that it has been achieved I’m having quite a bit of fun with this playground I built! I have collected the interesting visual results of my work in this online album.</description>
    </item>
    
    <item>
      <title>Eye Candy!</title>
      <link>http://blog.simiacryptus.com/posts/eye_candy/</link>
      <pubDate>Sun, 01 Apr 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/eye_candy/</guid>
      <description>Deep Learning has in recent years seen dramatic success in the field of computer vision. Deep convolutional neural networks tens of layers deep are becoming common and are some of the best performers for image recognition. Additionally, these learned networks can be used to produce novel artwork, as seen in recent publications about Deep Dream and Style Transfer. Today we will explore these applications with our own neural network platform, MindsEye.</description>
    </item>
    
    <item>
      <title>The 2D Convolution: A Layer Development Story</title>
      <link>http://blog.simiacryptus.com/posts/the_2d_convolution_a_layer_development_story/</link>
      <pubDate>Thu, 22 Feb 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/the_2d_convolution_a_layer_development_story/</guid>
      <description>Hello! Today we will discuss many aspects of developing differentiable network layers in MindsEye as we explore the 2D convolution layer and its various implementations. First, for background, see my previous post about Test Driven Development with neural networks. Given these test facilities and perhaps some more elemental layers, we need to construct a convolution layer that will work in large modern networks with large images as input.
Our first goal is to code a reference implementation, generally in pure Java.</description>
    </item>
    
    <item>
      <title>Java Reference Counting (and side projects)</title>
      <link>http://blog.simiacryptus.com/posts/java_reference_counting_and_side_projects/</link>
      <pubDate>Sat, 17 Feb 2018 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/java_reference_counting_and_side_projects/</guid>
      <description>I’ve recently completed another large update to the MindsEye code, implementing a reference-counting base for many of the core classes. This memory management pattern gives us much tighter control of system resources and dramatically reduces load on the JVM’s garbage collector. Memory contention has proven to be a major limiting factor in supporting modern large-scale deep-learning models, so these changes were quite beneficial, and I think they suggest why Java has often been less popular in this field: Java’s reliance on mark-sweep memory management is often quite inefficient compared to other models when applied to this problem.</description>
    </item>
    
    <item>
      <title>Optimization Research</title>
      <link>http://blog.simiacryptus.com/posts/optimization_research/</link>
      <pubDate>Sat, 23 Dec 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/optimization_research/</guid>
      <description>Now that I’ve cleaned up the testing and documentation of MindsEye, I have been able to re-focus on why I started writing it: Optimization Algorithm Research. In the course of playing with this code I have tried countless ideas, most of which taught me through failure instead of success&amp;hellip; However, I do have two ideas, fully implemented and demonstrated in MindsEye, that I’d like to introduce today: Recursive Subspace Optimization allows deep networks to be trained effectively, and Quadratic Quasi-Newton enhances L-BFGS with a quadratic term on the line-search path.</description>
    </item>
    
    <item>
      <title>Test Driven Development for Neural Networks, Part II - AB Testing</title>
      <link>http://blog.simiacryptus.com/posts/test_driven_development_for_neural_networks_part_ii__ab_testing/</link>
      <pubDate>Wed, 13 Dec 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/test_driven_development_for_neural_networks_part_ii__ab_testing/</guid>
      <description>In the last article, we covered a common testing framework for individual components, but we didn’t cover how these networks are actually trained. More specifically, how should we design a test suite to cover something so broad as optimization? A big problem here is that the components are heavily dependent on each other and also vary greatly in function and contract, and so there are few opportunities for generic testing and validation logic.</description>
    </item>
    
    <item>
      <title>Test Driven Development for Neural Networks, Part I - Unit Testing</title>
      <link>http://blog.simiacryptus.com/posts/test_driven_development_for_neural_networks_part_i__unit_testing/</link>
      <pubDate>Mon, 04 Dec 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/test_driven_development_for_neural_networks_part_i__unit_testing/</guid>
      <description>A critical part of any good software is test code. It is an understatement that tests improve quality; they improve the scalability of the entire software development process. Tests let you write more code, faster code, better code. One of the leading testing methodologies is unit testing: the philosophy of breaking down software into individual components and testing each separately. It turns out that a great case study in unit test design also happens to be one of today’s hot tech topics - artificial neural networks.</description>
    </item>
    
    <item>
      <title>GPU-accelerated neural networks with CuDNN</title>
      <link>http://blog.simiacryptus.com/posts/gpuaccelerated_neural_networks_with_cudnn/</link>
      <pubDate>Wed, 02 Aug 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/gpuaccelerated_neural_networks_with_cudnn/</guid>
      <description>A recent project with huge implications for the field of AI is NVIDIA’s cuDNN library and the related CUDA-based libraries. Beyond simply being very useful and enabling hardware-accelerated AI with cutting-edge performance, it establishes a common layer of high-performance mathematical primitives that uses the hardware to its best extent while providing a common API to write software against. With my recent addition of cuDNN-based layers, MindsEye should perform comparably to any other state-of-the-art deep learning library.</description>
    </item>
    
    <item>
      <title>What is the value of a human life?</title>
      <link>http://blog.simiacryptus.com/posts/what_is_the_value_of_a_human_life/</link>
      <pubDate>Sun, 21 May 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/what_is_the_value_of_a_human_life/</guid>
      <description>Recent developments in MindsEye have yielded greatly increased speed and scalability of network training. Major improvements to the OpenCL kernels have increased speed in some tests by 50x or more, and data-parallel training has been tested on a Spark cluster. This combination of GPU and cluster computing support should bring MindsEye much closer to the performance and scale of other frameworks, if not into the competitive range! The componentization of the optimization code that I wrote about previously enabled Spark support to be implemented in only about 100 lines in one self-contained class, a nice result of careful design.</description>
    </item>
    
    <item>
      <title>Autoencoders and Interactive Research Notebooks</title>
      <link>http://blog.simiacryptus.com/posts/autoencoders_and_interactive_research_notebooks/</link>
      <pubDate>Mon, 15 May 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/autoencoders_and_interactive_research_notebooks/</guid>
      <description>Further research and development with MindsEye has produced two new features I would like to discuss today. The first is a working demonstration of a stacked sparse denoising image autoencoder, which is a fundamental tool in any deep learning toolkit. Second, I will introduce a useful tool for producing both static and interactive scientific reports, which I use to produce many of my demonstrations and conduct much of my research.</description>
    </item>
    
    <item>
      <title>A Unified Design Pattern for Continuous Parameter Optimization</title>
      <link>http://blog.simiacryptus.com/posts/a_unified_design_pattern_for_continuous_parameter_optimization/</link>
      <pubDate>Tue, 09 May 2017 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/a_unified_design_pattern_for_continuous_parameter_optimization/</guid>
      <description>Almost two years ago I developed a neural network library called MindsEye, which has largely sat idle since the release of TensorFlow. Recently, however, I’ve wanted to follow up on research involving neural networks, and I wanted a “pure” Java option I could use for research. And so I decided it was time to revive my old project.
In this release, I have reviewed all of the code and made many improvements.</description>
    </item>
    
    <item>
      <title>RE: The anatomy of my pet brain</title>
      <link>http://blog.simiacryptus.com/posts/re_the_anatomy_of_my_pet_brain/</link>
      <pubDate>Sun, 18 Oct 2015 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/re_the_anatomy_of_my_pet_brain/</guid>
      <description>In my last post, I talked about a new project I was working on to explore convolutional neural networks (CNNs). I&amp;rsquo;ve spent much of the time since playing with and iterating on this library, and I wanted to take a moment to share what has been built so far. I&amp;rsquo;ve ended up with a library of 30 network layer types which can be wired in an arbitrary directed acyclic (non-recurrent) graph/network and perform gradient descent training and optimization.</description>
    </item>
    
    <item>
      <title>Fun with Deconvolutions and Convolutional Neural Networks in Java</title>
      <link>http://blog.simiacryptus.com/posts/fun_with_deconvolutions_and_convolutional_neural_networks_in_java/</link>
      <pubDate>Tue, 07 Jul 2015 00:00:00 +0000</pubDate>
      
      <guid>http://blog.simiacryptus.com/posts/fun_with_deconvolutions_and_convolutional_neural_networks_in_java/</guid>
      <description>I&amp;rsquo;ve gotten to an interesting point in my latest project, inspired by Google&amp;rsquo;s fascinating recent work with convolutional neural networks. The project can now apply inverse convolution operations using multiple fitness functions.
I wanted to explore the technology of image processing neural networks from the ground up, so I started by building the fundamentals of a backpropagation neural network library. Building the basic components and solving the initial problems has been interesting, and surprisingly complex.</description>
    </item>
    
  </channel>
</rss>