zawerf 12 days ago [-]
The most practical use not mentioned here is probably to import existing trained models/weights. I can see it being useful for anything that you want to run in real-time (e.g., webcam apps like https://github.com/ModelDepot/tfjs-yolo-tiny) and can't pay the round-trip cost to a server.

https://js.tensorflow.org/tutorials/import-keras.html

Training a model in the browser is the least practical use for tensorflow.js IMO (unless maybe you want to hijack people's browsers to help with training or something).

rasmi 12 days ago [-]
And if you want to just load generic pre-trained classifiers for a variety of tasks, ml5js (which uses TensorFlow.js under the hood) seems promising!

https://ml5js.org/docs/quick-start

jorgemf 12 days ago [-]
> Training a model in the browser is the least practical use for tensorflow.js

Unless you want to do some type of federated training in real time without sending private information to the servers (a very unusual case, in my opinion)

asual 12 days ago [-]
A nice introductory presentation about deep learning using tensorflow.js: https://youtu.be/SV-cgdobtTA
plurgid 12 days ago [-]
wow! that is an amazingly well done video!
russh 12 days ago [-]
Interesting video except I found the vocal fry to be distracting.
augbog 12 days ago [-]
thanks for sharing!! :)
galfarragem 12 days ago [-]
Noob question: Can anybody tell me in a few sentences of plain English what TensorFlow is, how it works, and why it seems to be so relevant?
quadrature 12 days ago [-]
A lot of problems boil down to optimizing a cost function. This function could be the error in recognizing cat pictures or the likelihood of recommending a movie to a user.

What people want is an easy way to express these functions so that a computer can optimize them. Tensorflow allows people to do just that: it lets you represent your mathematical equation in a way that can be analyzed and optimized by a computer.

It does this by exposing various mathematical operations that it understands[1]. As long as you can express your mathematical function using these operations, Tensorflow knows how to compute it efficiently and you can use its optimizer to find the optimal inputs.

This comes in handy for a lot of problems.

[1] https://www.tensorflow.org/api_guides/python/math_ops
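To make that concrete, here's a toy sketch in plain Python (no TensorFlow; the gradient is derived by hand, which is exactly the step TF automates for you):

```python
# Toy version of what TensorFlow automates: minimize a cost
# function by following its gradient downhill.
# Cost: squared error between a prediction w * x and a target y.

def cost(w, x, y):
    return (w * x - y) ** 2

def grad(w, x, y):
    # Hand-derived d(cost)/dw; TF computes this automatically.
    return 2 * x * (w * x - y)

w = 0.0                                # initial guess for the parameter
for _ in range(100):
    w -= 0.1 * grad(w, x=1.0, y=3.0)   # gradient descent step

print(round(w, 3))  # converges toward 3.0, the optimal input
```

Swap the one-parameter cost for a function of millions of parameters and hand the differentiation to the framework, and that's the basic shape of the thing.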

s-macke 12 days ago [-]
Tensorflow tries to fit the free parameters (usually millions of them) of a function y=f(x). The fitting algorithm usually gets thousands or millions of examples of what the output y should look like for a given input x.

For example, x can be tens of thousands of images of cats and dogs, and y can be 1 for a dog and 0 for a cat. The goal for the fitting algorithm is to find parameters that capture the concept of a cat and a dog, so that it can generalize and correctly categorize new images of cats and dogs. A bad fit would be if the network just memorized the example images.
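A minimal version of that fitting loop in plain Python (a made-up two-parameter f(x) = a*x + b standing in for the millions-of-parameters case):

```python
# Tiny version of "fit the free parameters of y = f(x) from examples":
# learn a and b in f(x) = a*x + b from (x, y) pairs by gradient descent.

examples = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # generated by y = 2x + 1

a, b = 0.0, 0.0   # free parameters, initially wrong
lr = 0.1          # learning rate
for _ in range(2000):
    for x, y in examples:
        err = (a * x + b) - y   # how far off the current fit is
        a -= lr * 2 * err * x   # d(err^2)/da
        b -= lr * 2 * err       # d(err^2)/db

print(round(a, 2), round(b, 2))  # recovers ~2.0 and ~1.0
```

With three clean points the fit recovers the generating function exactly; the memorization-vs-generalization problem only shows up once the model has far more parameters than the data constrains.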

galfarragem 12 days ago [-]
If I understand it correctly:

- Tensorflow is the best tool to do the grunt work of calibrating the parameters of fitting algorithms.

- The models themselves are proposed by humans, and that's where the ML 'art' is. It seems to be a kind of reverse engineering.

halflings 12 days ago [-]
Accurate, although in most cases the art is not in choosing the algorithm (often a regular feed-forward, convolutional or recurrent neural network) but the hyperparameters (regularization, number of layers, units, etc.) and sometimes the way the data is fed (augment the dataset with distortions? sample negative examples randomly? etc.)
gaius 12 days ago [-]
> Tensorflow is the best tool

It is one of many competing tools, which is “best” depends ironically on many variables. CNTK is a rival for example.

timr 12 days ago [-]
It's a high-performance-computing framework that allows you to specify matrix and vector math expressions in terms of a graph structure. In exchange, you get automatic derivative calculation, and a somewhat easier ability to distribute complicated calculations across processors.

HN people are mainly interested in it because it's one of the major frameworks for creating neural networks. Also because Google.

The framework is written in C++ but has decent bindings to Python, and is popular in both languages. Aside from wanting to use Javascript, there are no good technical reasons to do any of this stuff in Javascript.

nsthorat 12 days ago [-]
There are many reasons to do it in JavaScript:

- Many companies and projects have their entire server-side stack in JavaScript and Node.js, and often they want to simply make a prediction through a model. It's a lot to ask them to pull in a Python runtime just to make a prediction. TensorFlow.js with Node bindings to the TensorFlow C library enables this type of inference with minimal overhead.

- Privacy. You can make predictions locally, or send embeddings back to a server without the raw data ever leaving a client.

- Flexibility of JavaScript / TypeScript. Dynamic languages are great for scientific computing; TypeScript lets you define your own level of type safety, from raw JS on one end to strict typing support on the other.

- Interactivity / education tooling. See tensorflow playground for an excellent example.

- No servers for applications. Making predictions in TensorFlow on a server can be expensive in the long run. Hosting static weights on a server is much much cheaper.

JavaScript and Python ecosystems for machine learning are not mutually exclusive -- they both have their strengths and weaknesses.

timr 12 days ago [-]
There's literally only one reason to do it in Javascript: you want to use Javascript. There are dozens of reasons why it's a terrible idea: unfortunate memory consumption, abysmal performance, poor abstractions, bad library support, and so on. Tensorflow in Python isn't exactly a stellar choice for performance, but at least you gain flexibility and nice abstractions and good high-performance math/stats library support. Go with JS, and you're getting none of that.

"Many companies and projects have their entire server-side stack in JavaScript and Node.js, and often they want to simply make a prediction through a model."

Right. So this is "we don't want to use another language". Acknowledged.

"Privacy. You can make predictions locally, or send embeddings back to a server without the raw data ever leaving a client."

If calling out to a binary is a security problem for you, you have bigger problems than choice of language. Also, of course, you don't need tensorflow to convert your top-secret data into an input vector that you can send somewhere (seriously: it does not help with this problem).

Your third and fourth points -- flexibility and interactivity -- are indeed why people use Python vs C++ (even though it's more difficult and painful to get decent performance out of TF with that approach). So again, this boils down to "I don't want to use Python and I'd prefer to use JS instead."

"No servers for applications. Making predictions in TensorFlow on a server can be expensive in the long run. Hosting static weights on a server is much much cheaper."

You're contradicting yourself with this point. Servers are expensive so hosting static weights on a server is cheaper? I have no idea what this means.

williamdclt 12 days ago [-]
> So this is "we don't want to use another language".

Yup, which is a perfectly valid reason.

> If calling out to a binary is a security problem for you

You miss the point. The security problem is sending the raw data from the client to the server.

> So again, this boils down to "I don't want to use Python and I'd prefer to use JS instead."

Which once again, is a perfectly valid reason

> Servers are expensive so hosting static weights on a server is cheaper?

No, CPU/GPU-intensive tasks on a server are expensive, but storing a few static weights is cheap.

timr 12 days ago [-]
"You miss the point. The security problem is sending the raw data from the client to the server."

So don't do that. If you can run tensorflow on your device, you can call out to a local process.

If you want to use JS to do everything, fine. But that's not a good reason. It's just a reason.

zeroxfe 12 days ago [-]
> So don't do that. If you can run tensorflow on your device, you can call out to a local process.

Not from a webapp (without jumping through a dozen other hoops). With tensorflow.js, you can do (for example) pose estimation, or face detection, or audio recognition, right in the browser without sending data to a remote server.

> But that's not a good reason. It's just a reason.

Yes, of course it's a reason. The point is that it can be a good reason in many cases.

timr 12 days ago [-]
So now we're down to "I want to run a neural network exclusively in the browser" as the primary reason you'd want to use this.

OK, fine. That's a niche use-case for a domain where scale and performance matter so much that we're building specialized hardware to support it. 99.99% of developers would be better advised to find another way to solve their problem using more conventional tools.

There was a guy who built a life-sized house out of Lego once. It was a cool trick, but the difference between him and "modern Javascript" developers is that he didn't try to make anyone live in the house.

halflings 12 days ago [-]
How can you run Python tensorflow locally? We're talking about web apps here.

Try to understand what is at play here. It's not all about raw performance, there are many more important things to consider that GP explained extensively.

quadrature 12 days ago [-]
I think you're conflating privacy and security. Serving the model on the client side means the server never sees the inputs at all.

Also, he isn't contradicting himself: serving static data off a CDN or server is a lot easier and a lot less expensive than running a cluster to do inference.

bitL 12 days ago [-]
Is there a TF/Keras in Python to JavaScript transpiler somewhere? I don't want to waste time retyping complex methods to JS. Thanks for any suggestion!
halflings 12 days ago [-]
You can probably export your Keras model, load it in tensorflow.js, and train it. Makes more sense than transpiling.

Guide here:

https://js.tensorflow.org/tutorials/import-keras.html

bitL 12 days ago [-]
Thanks! Do you by chance know if it is practical to convert huge .h5 Keras models (~1GB) to TF.js layers models? I have some state-of-the-art computer vision models and it would be great if they could be used in a browser with a webcam, if they can fit into memory and be performant for inference.
zawerf 11 days ago [-]
Mobile has similar constraints to the web (though to a lesser extent), so you will find more resources there.

Look into quantizing the weights (e.g., reduce from 32bit per parameter to 8 bits) to reduce model size: https://www.tensorflow.org/mobile/optimizing#model_size
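For a rough idea of what that buys you, here's a minimal affine-quantization sketch in plain Python (illustrative only, not TF's exact scheme):

```python
# Sketch of linear weight quantization: map float weights to 8-bit
# integers (the ~4x size reduction mentioned above), plus the
# dequantization step used at inference time.

def quantize(weights, bits=8):
    lo, hi = min(weights), max(weights)
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels              # size of one quantization step
    q = [round((w - lo) / scale) for w in weights]  # ints in [0, 255]
    return q, lo, scale

def dequantize(q, lo, scale):
    return [lo + qi * scale for qi in q]

weights = [-0.51, 0.03, 0.27, 1.3]          # stand-in for real float32 weights
q, lo, scale = quantize(weights)
restored = dequantize(q, lo, scale)
# each restored value is within half a scale step of the original
```

Real quantization schemes do this per layer (or per channel) so that one outlier weight doesn't blow up the step size for the whole model.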

Micoloth 12 days ago [-]
Nice introduction!

I think your step 4 code is messed up though?

jaya-yellowant 12 days ago [-]
Fixed it, thanks!
burkel24 12 days ago [-]
You were so busy figuring out if you could, you didn't stop to wonder if you should…

Joking aside this is super cool

gaius 12 days ago [-]
I was going to upvote you ‘til I read the last line
brootstrap 12 days ago [-]
haha take my upvote sir. I'm looking forward to the new breed of websites that abuse our browsers unknowingly. First mining crypto, now we'll have people using thousands (millions?) of web browsers as part of some elaborate compute cluster running some 'AI' deep learning crap to try and optimize branding/click rates etc.