How to get rotation working


I had JPEG-like artifacts in mind at first, but if the loss is really that small, it should not be a problem. Tearing, however, is a problem, so let's hope it really gets fixed.
 
There are pretty good lossless compression schemes out there; I would guess the chip just doesn't support them.
Lossless compression has variable output size, possibly as much as 100% of the original, and they needed an algorithm that is guaranteed to stay below a certain amount of data. From what I understand, it uses a compression which is as lossless as possible: it stays lossless unless the image is too busy and can't be losslessly compressed below their needed bandwidth.
 
The reason for compression in the SSD is limited on-chip memory (storing 24 bits for every pixel costs a lot of $$$).

I haven't seen any artifacts from this compression, so I think it is almost lossless.

Using the OMAP5 2D accelerator for rotation needs quite some DRAM bandwidth and a driver squeezed in between X11, the frame buffer, and the DSS subsystem.

An alternative would be to use the OMAP5 "Tiler" (but it appears unable to handle a standard frame buffer format).

In both cases the SSD chip could be thrown out and replaced by 0R resistors (bridging the signal traces).

So there are many options. The one with the SSD seems to be the most transparent and the one with the fewest software issues (we now know how to initialize the chip, and then it runs autonomously).

Fixing the tearing issue might be possible by adding a software PLL for the 60Hz VSYNC to the kernel driver.
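
To sketch what I mean in plain C (and to be clear, these names are made up: timer_set_period_ns() and the phase-error measurement are assumptions for illustration, not anything from the real kernel driver): measure how far our predicted vsync is from the one that actually arrives, and slew a local timer period by a fraction of that error each frame.

Code:
/* Sketch only: timer_set_period_ns() and the phase-error source are
 * assumptions for illustration, not names from the real kernel driver. */
#include <stdint.h>

#define NOMINAL_PERIOD_NS 16666667LL        /* one 60 Hz frame */
#define MAX_SLEW (NOMINAL_PERIOD_NS / 100)  /* allow +/-1% deviation */

extern void timer_set_period_ns(int64_t ns); /* hypothetical helper */

static int64_t period_ns = NOMINAL_PERIOD_NS;

/* Called from the VSYNC interrupt with the offset (in ns) between the
 * vsync we predicted and the one that actually arrived. */
void vsync_pll_update(int64_t phase_error_ns)
{
    /* Proportional control: slew the period by a small fraction of the
     * phase error, so the local timer converges on the panel's clock
     * instead of jumping (a jump would itself be visible). */
    period_ns += phase_error_ns / 64;

    /* Clamp so one bad measurement can't run the timer away. */
    if (period_ns > NOMINAL_PERIOD_NS + MAX_SLEW)
        period_ns = NOMINAL_PERIOD_NS + MAX_SLEW;
    if (period_ns < NOMINAL_PERIOD_NS - MAX_SLEW)
        period_ns = NOMINAL_PERIOD_NS - MAX_SLEW;

    timer_set_period_ns(period_ns);
}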
 
Well, if you're interested, I could set up a remote connection for you, including a webcam, or I could send you an EVM or prototype setup.
Currently, the missing piece seems to be the kernel module. It is not present in the current builds of the Pyra OS, or in any other OS for that matter. It is probably the "gcbv" module from TI's Android 3.8 kernel that needs to be ported/compiled. I have no intention of attempting this myself.

Partial source code for the client libraries can be found at http://git.omapzoom.org/platform/external/bltsville.git/?p=platform/external/bltsville.git;a=summary (among other places, probably; this is what I used). You need to run ldconfig manually after installing it. Some needed binaries are included; I'm not sure whether they'll pose problems or not.

As for testing whether it works, I tried the bvtest program from http://git.ti.com/bvtest

You'll need to add /usr/local/include/bltsville to CFLAGS, and you might want to remove USE_OMAPFB (or whatever it was called) from the makefile.

For the test itself, I passed the following parameters. They're nothing special, just enough to get the test program to attempt doing something. The input file is from the official API repo (what you find when you just search for bltsville).

Code:
./bvtest -imp ticpu -src1file ../../../../bltsville/bltsville/blend-854x480-bad.jpg -src1 RGB24 854x480 -dst RGB16 854x480 -dstfile arne.bmp
Error copying source file data to source surface: (0x00000003) REQUIRED RESOURCE UNAVAILABLE.
This will throw an error that, according to the documentation, means it just failed to initialize. It seems to be looking for /dev/gcioctl, which indeed isn't present. Since I couldn't find a copy of the software implementation, I haven't tried with that, so I don't know if the test program will work at all.

 

And that's my core dump of what I learned trying to get it working last night :p
 
which is as lossless as possible
There's no such thing as "as lossless as possible" - either it's lossy or it isn't.
It can be lossless on most images and only lossy on pathological examples. If it is only lossy on noise-like images, then who cares?

Imagine something like this:

A 1280x720 raw RGB buffer is 2700 kilobytes.

Imagine you have only 1000 kilobytes available.

You try to make a PNG, and use that if it is less than 1000 kilobytes.

Otherwise you use a JPEG file, with the quality parameter of the lossy compression set as high as possible while still fitting in 1000 kilobytes.

This is not how the rotation chip's compression works, but it's roughly something like that. In many cases it will be lossless: e.g. for desktop applications or web pages, which in practice contain lots of areas of one single color or simple gradients. When it's lossy, the loss is most likely to be mostly in the bits that were lost already, e.g. when viewing a JPEG file with high amounts of detail, it will most likely not render all the JPEG artifacts exactly correctly ;)
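
To make the budget idea concrete, here is a toy version in C - again, explicitly not the chip's actual algorithm: run-length encoding stands in for PNG, dropping low bits stands in for JPEG, and the frame only becomes lossy when the lossless encoding doesn't fit the budget.

Code:
/* Toy illustration of budget-limited "lossless if possible" compression.
 * NOT the rotation chip's real algorithm - just the PNG-else-JPEG idea,
 * with RLE standing in for PNG and bit truncation for JPEG. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Run-length encode n bytes into out (out must hold 2*n bytes worst case).
 * Returns the encoded size. */
static size_t rle(const uint8_t *in, size_t n, uint8_t *out)
{
    size_t o = 0, i = 0;
    while (i < n) {
        size_t run = 1;
        while (i + run < n && in[i + run] == in[i] && run < 255)
            run++;
        out[o++] = (uint8_t)run;   /* run length  */
        out[o++] = in[i];          /* pixel value */
        i += run;
    }
    return o;
}

/* Compress pixels into out without ever exceeding 'budget' bytes.
 * Returns how many low bits had to be dropped: 0 means lossless. */
static int compress_to_budget(uint8_t *pixels, size_t n,
                              uint8_t *out, size_t budget)
{
    int dropped = 0;
    while (rle(pixels, n, out) > budget && dropped < 8) {
        /* Frame too busy to fit losslessly: quantize in place, which
         * lengthens the runs (and creates banding), then try again. */
        dropped++;
        for (size_t i = 0; i < n; i++)
            pixels[i] &= (uint8_t)(0xFFu << dropped);
    }
    return dropped;
}

int main(void)
{
    uint8_t flat[1024], grad[1024], out[2048];
    for (int i = 0; i < 1024; i++) {
        flat[i] = 200;         /* one solid color: trivially compressible */
        grad[i] = (uint8_t)i;  /* smooth gradient: RLE-hostile            */
    }
    printf("flat:     %d low bits dropped\n",
           compress_to_budget(flat, 1024, out, 512));
    printf("gradient: %d low bits dropped\n",
           compress_to_budget(grad, 1024, out, 512));
    return 0;
}

With a 512-byte budget, the flat buffer compresses losslessly while the gradient has to drop 2 bits per pixel - exactly the "busy images degrade gracefully" behaviour described above.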

In any case, if there's a way to bypass the rotation chip and get a 720x1280 buffer, then it's straightforward to make an image viewer that is perfectly lossless, for those of us who are paranoid about compression artifacts. For moving images (e.g. in most games), it is very doubtful that anyone would be able to notice any compression artifacts.
 
I have the solution! Just use a vector CRT instead of an LCD and simply rotate the deflection coils so it fits, or use a motor to rotate the coils. That would be true hardware-based rotation!
 
There's no such thing as "as lossless as possible" - either it's lossy or it isn't.
Your grasp of the English language astounds me sometimes.
Lossless:

  • adj.

    Not losing information.
 

This adjective has a binary definition. You are using an adjective that has an absolute value and you are attaching it to something that is relative. That just does not make any sense in ANY language. It's like saying "as best as possible" - either it is the best, or it's not.

QED.
 
For moving images (e.g. in most games), it is very doubtful that anyone would be able to notice any compression artifacts.
I am actually a little worried that such artefacts might cause some slight visual issues for 2D games with scrolling elements on high-contrast foregrounds/backgrounds - will the artefacts leave a "trail" because of the compression being processed in real time?

For 3D games, I agree that it should be pretty much invisible.
 
Lossless:

  • adj.

    Not losing information.
This adjective has a binary definition. QED.
Yup, that is absolutely correct.
You are using an adjective that has an absolute value and you are attaching it to something that is relative. That just does not make any sense in ANY language. It's like saying "as best as possible" - either it is the best, or it's not.
Simply astounding.
 
For moving images (e.g. in most games), it is very doubtful that anyone would be able to notice any compression artifacts.
I am actually a little worried that such artefacts might cause some slight visual issues for 2D games with scrolling elements on high-contrast foregrounds/backgrounds - will the artefacts leave a "trail" because of the compression being processed in real time?

For 3D games, I agree that it should be pretty much invisible.
I think the compression is on a frame-by-frame basis, so in principle there should be no difference between the artifacts in fast moving/scrolling games and the artifacts in still images.

There's no such thing as "as lossless as possible" - either it's lossy or it isn't.
Your grasp of the English language astounds me sometimes.
Lossless:

  • adj.

    Not losing information.
 

This adjective has a binary definition. You are using an adjective that has an absolute value and you are attaching it to something that is relative. That just does not make any sense in ANY language. It's like saying "as best as possible" - either it is the best, or it's not.

QED.
Potential grammatical issues aside, "as lossless as possible" actually does make sense in this context. There are images that will be compressed without loss, and other images that will be compressed with loss. In many practical settings it will actually be lossless.

Other approaches would be "always lossy", e.g. reducing the memory footprint by lowering the resolution or the bit depth.

E.g. the Pandora in X uses only 16 bpp by default, which means that a third of the information is actually lost, and the artifacts can be quite noticeable (in particular, the banding in some color gradients has bothered me). This is way worse than what the rotation chip will typically do.
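
To illustrate (this is just the standard RGB888-to-RGB565 bit packing, nothing Pandora-specific): the low 3 bits of red and blue and the low 2 bits of green are simply discarded, so eight neighbouring shades collapse into a single value - which is exactly the banding.

Code:
/* What 16 bpp (RGB565) costs: the low 3 bits of red and blue and the
 * low 2 bits of green are gone, so smooth gradients show banding. */
#include <stdint.h>
#include <stdio.h>

static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    /* Eight consecutive 24-bit red levels collapse into one 16-bit value. */
    for (int r = 96; r < 104; r++)
        printf("RGB888 (%3d,0,0) -> RGB565 0x%04x\n",
               r, pack565((uint8_t)r, 0, 0));
    return 0;
}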
 
Well, if you're interested, I could set up a remote connection for you, including a webcam, or I could send you an EVM or prototype setup.
Sorry, I can't commit to something like that. There are probably plenty of people here who are as qualified as I am, or more so, who can and want to; maybe a thread should be started asking for this.
 
Fixing the tearing issue might be possible by adding a software PLL for the 60Hz VSYNC to the kernel driver.
Not sure if this was included in your description of the PLL, but the chip probably has timers available that could be used for this; I guess the question is whether any of them can be muxed onto the pin you need. Getting it roughly synchronized with another signal or the DSS's VSYNC will be annoying but plausible.

I think the compression is on a frame-by-frame basis, so in principle there should be no difference between the artifacts in fast moving/scrolling games and the artifacts in still images.
The algorithm is almost certainly macroblock-based, meaning that a scrolling image can show changing artifacts as pixel groups move between fitting in a single macroblock and being split between multiple macroblocks.
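
A toy example of what that means for scrolling, assuming 8x8 macroblocks purely for illustration (I don't know the chip's real block size): as a sprite scrolls horizontally, it alternates between fitting inside one block and straddling two, so the same group of pixels gets grouped - and compressed - differently from frame to frame.

Code:
/* Illustration only: 8x8 macroblocks are an assumption, not the chip's
 * documented block size.  A 6-pixel-wide sprite scrolling right keeps
 * changing how it is partitioned into blocks. */
#include <stdio.h>

#define BLOCK 8

int main(void)
{
    for (int scroll = 0; scroll < 8; scroll++) {
        int left  = 2 + scroll;   /* sprite covers columns left..left+5 */
        int right = left + 5;
        if (left / BLOCK == right / BLOCK)
            printf("frame %d: sprite inside block %d\n",
                   scroll, left / BLOCK);
        else
            printf("frame %d: sprite split across blocks %d and %d\n",
                   scroll, left / BLOCK, right / BLOCK);
    }
    return 0;
}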
 
I think the compression is on a frame-by-frame basis, so in principle there should be no difference between the artifacts in fast moving/scrolling games and the artifacts in still images.
The algorithm is almost certainly macroblock-based, meaning that a scrolling image can show changing artifacts as pixel groups move between fitting in a single macroblock and being split between multiple macroblocks.

Oh yes, that is very likely. What I meant is that, unlike with most video codecs, AFAIU it does not matter whether the 60 frames in one second are all the same or all completely different. People might be thinking of fixed-bitrate compressed video streams, which tend to show more artifacts in fast-moving scenes, and that is not the type of thing that would happen here.
 
Oh yes, that is very likely. What I meant is that, unlike with most video codecs, AFAIU it does not matter whether the 60 frames in one second are all the same or all completely different. People might be thinking of fixed-bitrate compressed video streams, which tend to show more artifacts in fast-moving scenes, and that is not the type of thing that would happen here.
Yeah, and it's probably true that "leaving a trail" is more something that would happen with temporal compression.

The artifacts here would more likely manifest as shimmering or noise, although they'll probably still be very hard to perceive.
 
Forgive me if I say something particularly stupid, but as I was looking at the pictures ED provided for the ssd-whatsit test, I noticed a vertical column, "x" pixels wide... "missing?" from the center of all the compressed shots... Is that actually something we will have to live with whenever we view anything on the Pyra's screen? I mean, with still shots it's not horrible, but when we start talking about moving imagery (movies, games, whatever), it's going to stick out like a sore thumb.
 