
Attansic L1 gigabit driver on ASUS P5B-E (kernel 2.6.22.6) September 12, 2007

Posted by TSAI HONG-BIN in Linux.
add a comment

Guys, you don’t have to patch the code anymore.

Please check the kernel config: Device Drivers -> Network device support -> Ethernet (1000 Mbit) -> Attansic L1. Though it’s marked as “experimental”, it works fine on my P5B-E box.
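If you build it as a module, the steps are roughly as follows (a sketch; CONFIG_ATL1 is the config symbol and atl1 the module name of the in-tree driver):


$ make menuconfig    # enable CONFIG_ATL1 as a module (M)
$ make && sudo make modules_install
$ sudo modprobe atl1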


kernel synchronization study note (1) June 20, 2007

Posted by TSAI HONG-BIN in Linux.
add a comment

Understanding how the Linux kernel synchronizes is a huge task… and reading the code made me realize how rusty my OS concepts are.

Three main structures closely relate to kernel synchronization and scheduling: softirqs, tasklets and work queues.

Before digging into each structure, we must know that the term “context” describes code running on the CPU. So, as you may recall, “context switching” refers to switching the process that runs on the CPU. A context that runs as a process is called process context. In addition to regular processes, a CPU needs to handle interrupts (either hardware or software), so the context that runs as an interrupt handler is called interrupt context.

The main difference between process context and interrupt context is that process context can block, while interrupt context can’t.

“Deferrable functions” are non-urgent, interruptible kernel functions (i.e. softirqs and tasklets) that run in interrupt context. Work queues, however, run in process context.

Tasklets are implemented on top of softirqs. The tasklet structure is defined in ~/linux-src/include/linux/interrupt.h, and the enumeration of softirq signals is defined a few lines before tasklet_struct. As the comment there says, “For almost all the purposes tasklets are more than enough.” If you need to write a device driver, tasklets should be sufficient.
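Since the API is small, here is a minimal sketch of deferring work from an interrupt handler to a tasklet (2.6-era API; the names my_tasklet_fn and my_isr are made up):


#include <linux/kernel.h>
#include <linux/interrupt.h>

/* deferred work: still runs in interrupt context, so it must not sleep */
static void my_tasklet_fn(unsigned long data)
{
        printk(KERN_DEBUG "tasklet ran, data=%lu\n", data);
}

static DECLARE_TASKLET(my_tasklet, my_tasklet_fn, 0);

static irqreturn_t my_isr(int irq, void *dev_id)
{
        /* do only the urgent work here, defer the rest to the tasklet */
        tasklet_schedule(&my_tasklet);
        return IRQ_HANDLED;
}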

A work queue, however, runs in a kernel thread, so its handler can sleep if necessary, for example while waiting for user data.
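A matching sketch for a work queue (again, names are made up): the handler runs in process context on the shared “events” kernel thread, so it may sleep.


#include <linux/workqueue.h>
#include <linux/delay.h>

/* runs in process context (a kernel thread), so sleeping is allowed */
static void my_work_fn(struct work_struct *work)
{
        msleep(100);               /* e.g. wait for hardware or user data */
}

static DECLARE_WORK(my_work, my_work_fn);

static void kick_off(void)
{
        schedule_work(&my_work);   /* queue it on the shared workqueue */
}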

Got pppoe working June 5, 2007

Posted by TSAI HONG-BIN in Linux.
add a comment

On Ubuntu! So exciting.

It took … almost an hour, and I can hardly tell how it works … it’s working anyway. No wonder ordinary people prefer to stick with Micro$oft Windows; there it takes less than 5 minutes to get online over PPPoE. Oh! Did I mention that my alpha blending effect works on merely 256MB of RAM? I can afford the time to set up my system, but not the money. Question: would you spend over $3000 on additional RAM just for “watching windows fade”? How about saving the money and watching a 3D cubic desktop swirl instead?

Anyway, do


$ sudo apt-get install pppoe

in advance. Then run


$ sudo pppoe-setup

and follow the instructions to enter the username and password issued by your ISP. Edit /etc/ppp/peers/dsl-provider and give


user username@sample.net
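
For reference, the finished peers file ends up looking roughly like this (a sketch: eth0 is an assumption, the other lines are standard pppd options):


# /etc/ppp/peers/dsl-provider
plugin rp-pppoe.so eth0
user "username@sample.net"
noipdefault
defaultroute
hide-password
noauth
persist
usepeerdns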

Now run


$ sudo pon dsl-provider

and get ready to spin

Problematical ATI driver… June 5, 2007

Posted by TSAI HONG-BIN in Linux.
1 comment so far

Dreaming of a fancy desktop but can’t afford Vista and an additional 1GB of RAM? Try Ubuntu + Beryl.

Dreaming of Ubuntu + Beryl with an ATI VGA card? Yeah, keep dreaming…

I don’t want to be hostile, so let’s skip my *beep* experience with AMD China and jump directly to my laptop with its ATI X600. fglrx is the proprietary Linux driver officially released by AMD. IMHO, just take it and use it as is; don’t ask for more.

My first impression of fglrx was not so bad. The installation is quick and easy, and the aticonfig utility that comes with it is handy. I got my dual-head setup by simply running “aticonfig --initial=dual-head --screen-layout=right” and it worked just fine. Unfortunately, the happy story ends here. Oh well, let me add some notes for those who want to know how to install fglrx manually.

1. download the driver from http://ati.amd.com/support/drivers/linux/radeonprevious-linux.html
2. execute it and keep clicking Yes/Next
3. go to /lib/modules/fglrx/ and run make_install.sh
4. add this line


DISABLED_MODULES="fglrx"

to /etc/default/linux-restricted-modules-common

5. run depmod and reboot
6. done.
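
Condensed into commands, that’s roughly (a sketch; the installer filename is a guess, adjust to whatever you downloaded):


$ sudo sh ati-driver-installer-*.run
$ cd /lib/modules/fglrx && sudo sh make_install.sh
$ echo 'DISABLED_MODULES="fglrx"' | sudo tee -a /etc/default/linux-restricted-modules-common
$ sudo depmod -a && sudo reboot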

The nightmare starts with my wish for a fancy desktop: Beryl. To support Beryl, X should start with a 3D engine. As far as I know, AIGLX is the best choice, as long as your VGA driver supports it.

*BEEP* ATI FGLRX DOES NOT SUPPORT AIGLX!!

OK, let’s go to plan B: Xgl. The good news is that my Xgl runs well and I can see windows jumping and swirling over a screen. One screen. Remember? I originally set up dual-head; now the other monitor just blacks out … Maybe AIGLX is a must for running dual-head + Beryl. So I ditched fglrx and switched to the open source radeon driver. It’s not difficult: just replace the “fglrx” strings in /etc/X11/xorg.conf with “ati”. This time AIGLX is enabled and dual-head stays. How wonderful. It just doesn’t support direct rendering on the ATI X600…!
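
In other words, the Device section of /etc/X11/xorg.conf ends up looking like this (the Identifier string is whatever yours already says):


Section "Device"
    Identifier "ATI Radeon X600"
    Driver     "ati"
EndSection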

*BEEP* open source radeon driver doesn’t support all ATI cards!!

Now I see my dilemma: fancy desktop or dual-head? I picked the latter in the end… and I have had enough of ATI drivers.

Some notes about video playback on Linux June 4, 2007

Posted by TSAI HONG-BIN in Linux.
add a comment

In addition to the maintenance projects I already had, my responsibilities here at the new office have grown. The most challenging one is a “video renderer”, and on Linux no less. As far as I can remember, none of the classes I took in undergrad or grad school related to multimedia. Computer graphics? No. Data compression? No. The only thing I know about multimedia is DRM. No, not “Direct Rendering Manager,” but “Digital Rights Management.”

Fortunately, we have MPlayer on Linux. MPlayer is a powerful, open source media player. It supports various audio/video codecs and offers various ways of rendering to fulfill compatibility over different platforms and operating systems. Please visit http://www.mplayerhq.hu/ for further information.

So, let’s dig into the video rendering part of MPlayer. By running


$ mplayer -vo help

you can see a list of available video rendering “techniques”: X11, Xv, OpenGL, framebuffer, etc. Without a thorough understanding of them all, I picked Xv (XVideo) for the job. A few reasons pushed me toward this decision. The most important one: I found a sample at http://bellet.info/XVideo/testxv.c, and my implementation is a revision of it. Secondly, XVideo builds on the X11 system, so unlike directfb it has only a loose dependency on the VGA driver. One more thing: Xv is an X11 extension that deals mainly with video playback, which to me implies fewer APIs to go through.

Before we look into the code, I’d like to explain the “color spaces” we use to reproduce images and video on screen. As you may already know, every color the human visual system recognizes can be composed from three primary colors: red, green and blue. Images and video can thus be reproduced on screen by mixing these three color signals electronically, and that is how colors used to be defined: draw R, G and B separately within one pixel, and the viewer perceives a mixed color.

As technology evolved, a new family of color spaces was introduced: YUV, YCrCb, YPrPb, YIQ… all variations on the same idea (let’s use YCrCb). Instead of the original colors, the YCrCb system encodes color differences: the “Y” signal stands for luminance and the “CrCb” signals for chrominance. Since human eyes are more sensitive to luminance than to chrominance, we can carry less CrCb signal and still reproduce the image at comparable quality. You may see remarks like “4:2:2”, “4:2:0” or “4:1:1” on a DVD case; they indicate the chroma sampling scheme. For example, “4:2:0” means that for each 2x2 block of four pixels we keep four Y samples but only one Cr and one Cb sample. By applying YCrCb instead of RGB we save bandwidth when transmitting visual signals; in other words, within the same bandwidth we can carry more data and reproduce images at better quality.
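To make the luma/chroma split concrete, here is the BT.601 RGB-to-YCrCb conversion as a small C function (illustrative only, not MPlayer code):


/* BT.601 full-range RGB -> YCbCr; clamping to 0..255 omitted for brevity */
static void rgb_to_ycbcr(unsigned char r, unsigned char g, unsigned char b,
                         unsigned char *y, unsigned char *cb, unsigned char *cr)
{
    double Y = 0.299 * r + 0.587 * g + 0.114 * b;

    *y  = (unsigned char)(Y + 0.5);
    *cb = (unsigned char)(0.564 * (b - Y) + 128 + 0.5);  /* (B - Y) / 1.772 + 128 */
    *cr = (unsigned char)(0.713 * (r - Y) + 128 + 0.5);  /* (R - Y) / 1.402 + 128 */
}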

MPlayer supports most color spaces as well, and therefore a great portion of its video rendering source code deals with such switch-cases. For example, if a video stream is recorded in RGB, the video renderer either converts the stream into another color space or has to draw it on screen in RGB fashion. For my test, this is not a concern.

Finally, let’s go straight to the source.

Required headers:

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <time.h>
#include <string.h>
#include <sys/ipc.h>
#include <sys/shm.h>
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/Xmd.h>
#include <X11/Xutil.h>
#include <X11/extensions/Xvlib.h>
#include <X11/extensions/Xv.h>
#include <X11/extensions/XShm.h>

libX11 draws the window on the X11 system; libXv does the X-Video work. We use shared memory, so shm.h is needed. X-Video works this way:

1. search for an available XvPortID and do some initialization.
2. XvShmCreateImage allocates space in shared memory and attaches it to the XvImage object via shmat() (see the sketch after this list).
3. read raw data from the source (e.g. file I/O) and memcpy it from the input buffer.
4. call XvShmPutImage to render the data on screen.
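
Here is a sketch of steps 1 and 2, following the testxv.c sample linked above (error handling omitted; the FOURCC value and the frame size are assumptions):


#define FOURCC_YV12 0x32315659      /* 'YV12': planar YUV 4:2:0 */

int width = 720, height = 480;      /* assumed frame size */
Display *dpy = XOpenDisplay(NULL);
unsigned int nadaptors, i;
XvAdaptorInfo *adaptors;
XvPortID xv_port = 0;

/* step 1: find an adaptor that can put images and grab one of its ports */
XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &nadaptors, &adaptors);
for (i = 0; i < nadaptors; i++) {
    if (adaptors[i].type & XvImageMask) {
        xv_port = adaptors[i].base_id;
        break;
    }
}
XvGrabPort(dpy, xv_port, CurrentTime);

/* step 2: create an XvImage backed by a shared memory segment */
XShmSegmentInfo shminfo;
XvImage *xv_yuv_image = XvShmCreateImage(dpy, xv_port, FOURCC_YV12, NULL,
                                         width, height, &shminfo);
shminfo.shmid   = shmget(IPC_PRIVATE, xv_yuv_image->data_size,
                         IPC_CREAT | 0777);
shminfo.shmaddr = xv_yuv_image->data = shmat(shminfo.shmid, 0, 0);
shminfo.readOnly = False;
XShmAttach(dpy, &shminfo);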

FILE* fp;
int size = xv_yuv_image->data_size;
char buf[size/8];                   /* read in chunks of 1/8 frame */
char buf_frame[size];               /* accumulates one full frame  */
int buf_read = 0;
char* sFilename = argv[1];
fp = fopen(sFilename, "rb");

if (!fp) {
    printf("file open error!\n");
    exit(-1);
}

memset(buf_frame, 0, sizeof(buf_frame));
memset(buf, 0, sizeof(buf));

while (1) {

    int n = fread(buf, 1, sizeof(buf), fp);
    if (n <= 0)                               /* EOF or read error */
        break;
    memcpy(buf_frame + buf_read, buf, n);     /* accumulate the chunk */
    buf_read += n;
    if (buf_read < (int)sizeof(buf_frame))    /* no full frame yet */
        continue;

    /* a full frame is ready: hand it to XVideo */
    memcpy(xv_yuv_image->data, buf_frame, sizeof(buf_frame));

    XGetGeometry(dpy, window, &_dw, &_d, &_d, &_w, &_h, &_d, &_d);
    XvShmPutImage(dpy,
                  xv_port,
                  window,
                  gc,
                  xv_yuv_image,
                  0,
                  0,
                  xv_yuv_image->width,
                  xv_yuv_image->height,
                  0,
                  0,
                  _w,
                  _h,
                  True);

    memset(buf_frame, 0, sizeof(buf_frame));
    usleep(33333);                            /* ~30 fps */

    memset(buf, 0, sizeof(buf));
    buf_read = 0;
}

The while loop above repeatedly reads from the file and calls XvShmPutImage to render each frame until EOF. Note the usleep() that throttles the rendering (33333 µs per frame) so it plays roughly like normal video: 30fps.