4

I am trying to create a web UI for image processing, with some operations similar to what a site like fotor.com offers. However, I am having trouble achieving similar performance. For example, let's say I upload an image of around 3 MB to fotor.com and perform a basic operation such as setting the brightness to "full". The preview image (shown on a canvas) is rendered immediately, with almost no lag.

I tried to do the same operation using the popular plugin "commonjs", but it took too long to process the same image, and in some cases it hung the browser.

I have also tried server-side image processing with http://imageprocessor.org, since I am working with ASP.NET, but after the image is processed on the server, it takes too long to load again in the browser.

So my question is: can someone suggest how I can preview the processed image with minimal lag, the way fotor.com does?

BJ Patel
  • I think webgl is what you are looking for https://webglfundamentals.org/webgl/lessons/webgl-image-processing.html – Ewan Jan 19 '18 at 18:42

3 Answers

7

I cannot and will not recommend any libs or frameworks for this, but if you are willing to implement a system yourself, here is a simple outline of what might work:

  • send the image to the server and reduce its resolution to something which can be processed faster

  • send the reduced image back to the client; now implement (or find) a client-side JavaScript lib that can process this smaller image quickly and reliably

  • in the client UI, do the processing in real time, and record the operations the user applies to the image (like "brightness increased by 12%", "despeckle operation with parameters XY")

  • when the user is finished, send the commands to the server and apply them to the full-size image at the server side

  • provide the final image for storing (maybe as download at the client side, or for storing at some cloud service)
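The record-and-replay idea in the steps above can be sketched in JavaScript. This is a hedged illustration, not code from any particular library: the names `EditRecorder` and `applyBrightness` are invented for this example, and brightness here is a simple additive adjustment on raw RGBA data. The same function could run in the browser on the small preview and again on the server (e.g. under Node) on the full-size image.

```javascript
// Records the operations the user applies, so they can be replayed
// server-side on the full-resolution image. Illustrative sketch only.
class EditRecorder {
  constructor() { this.ops = []; }
  record(name, params) { this.ops.push({ name, params }); }
  toJSON() { return JSON.stringify(this.ops); }
}

// Example operation: additive brightness on raw RGBA pixel data
// (the kind of array you get from canvas getImageData().data).
function applyBrightness(pixels, percent) {
  const delta = Math.round(255 * percent / 100);
  const out = new Uint8ClampedArray(pixels.length); // clamps to 0..255
  for (let i = 0; i < pixels.length; i += 4) {
    out[i]     = pixels[i]     + delta; // R
    out[i + 1] = pixels[i + 1] + delta; // G
    out[i + 2] = pixels[i + 2] + delta; // B
    out[i + 3] = pixels[i + 3];         // alpha unchanged
  }
  return out;
}

const recorder = new EditRecorder();
recorder.record("brightness", { percent: 12 });
// When the user is finished, POST recorder.toJSON() to the server
// and replay the same ops there against the original image.
```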

Doc Brown
  • should we really expect the server to be more powerful than the client machine? – Ewan Jan 19 '18 at 18:35
  • @Ewan: Using a server extends the range of tools for processing huge pictures. For example, it will allow the OP to use mature image processing tools like ImageMagick, for which no Javascript equivalent exists. But maybe your suggestion of WebGL is all the OP needs, I don't know. – Doc Brown Jan 19 '18 at 22:10
2

If it were me, I would probably implement it locally on the user's machine using a GPU-based toolkit like WebGL. This allows you to do the processing very quickly and in parallel. This solves several problems:

  • reduces bandwidth, storage, and power since you aren’t sending images to and from the server
  • increases privacy because the user's data never leaves their machine
  • reduces liability because you aren’t storing (even temporarily) illegal material on your servers (copyrighted stuff, anything indecent or exploitative)

I do agree with Doc Brown that working on proxies and working on tiles can improve performance if you do it right.
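To make the WebGL approach concrete, here is a hedged sketch of what the brightness adjustment looks like as a fragment shader, together with a plain-JS version of the same per-pixel math. The shader source and the `brightnessKernel` helper are illustrative assumptions, not from any library; in real WebGL the GPU evaluates the shader for every pixel in parallel, which is where the speed comes from.

```javascript
// WebGL 1 (GLSL ES) fragment shader: multiply each pixel's RGB by a
// brightness factor uniform, leaving alpha alone. Passed to the GPU
// as a source string via gl.shaderSource()/gl.compileShader().
const brightnessShader = `
  precision mediump float;
  uniform sampler2D u_image;
  uniform float u_brightness;   // 1.0 = unchanged, 2.0 = "full"
  varying vec2 v_texCoord;
  void main() {
    vec4 c = texture2D(u_image, v_texCoord);
    gl_FragColor = vec4(c.rgb * u_brightness, c.a);
  }
`;

// CPU reference of the same kernel, handy for testing the math.
// Takes one [r, g, b, a] pixel (0..255) and a brightness factor.
function brightnessKernel(rgba, factor) {
  return [
    Math.min(255, Math.round(rgba[0] * factor)),
    Math.min(255, Math.round(rgba[1] * factor)),
    Math.min(255, Math.round(rgba[2] * factor)),
    rgba[3], // alpha unchanged
  ];
}
```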

user1118321
1

Very similar to Doc Brown's answer:

  • let the user choose the file using dropzonejs or something similar. Do not upload the file to the server yet;
  • create an HTML canvas that holds a smaller version of the image;
  • apply filters and effects on the small canvas using Javascript libraries such as filterous or CamanJS. Real-time, front-end only;
  • also record the operations (at front-end);
  • POST the original image and the list of operations to the server;
  • apply the same operations using the same JavaScript code, but server-side this time. Write a PhantomJS page and script for this purpose.
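The "smaller version of the image" step above can be sketched as a small helper that fits the original inside a maximum box while preserving aspect ratio. This is an illustrative assumption: the `previewSize` name and the 1024-pixel limit are made up for this example, not taken from dropzonejs, filterous, or CamanJS.

```javascript
// Compute preview-canvas dimensions that fit the original image
// inside a maxSide x maxSide box, preserving aspect ratio.
function previewSize(width, height, maxSide = 1024) {
  const scale = Math.min(1, maxSide / Math.max(width, height)); // never upscale
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

// In the browser you would then draw the image onto the smaller canvas:
//   const { width, height } = previewSize(img.naturalWidth, img.naturalHeight);
//   canvas.width = width; canvas.height = height;
//   canvas.getContext("2d").drawImage(img, 0, 0, width, height);
```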