Common error messages
- Errors about RTC, the Linux real-time clock device
- Video card memory errors
Capture card problems
- Channels black-and-white and numbers off-by-one
- Too few buffers with bttv before kernel 2.4.21
- bttv takes a long time to load
- Audio problems with Prolink PixelView cards
- Poor performance with the rivatv driver
- Unable to tune to channels using the bttv, saa7134 or cx88 drivers
Video card problems
- tvtime not starting when using the Radeon FireGL drivers
- Corrupted YUY2 video with EPIA onboard video
- Terrible XVIDEO performance with Radeon cards and XFree86 4.3
- Poor performance with SiS driver in XFree86 4.2.1 and earlier
- XVIDEO always-on-top with some Radeon cards
- Corrupted image borders on NVIDIA cards in overscan modes
- No XVIDEO support on pre-GeForce NVIDIA cards with the "nv" driver
- Problems with NVIDIA TNT and TNT2 cards
- XVIDEO initialization problem with the savage driver
Driver conflicts
- Using overlay mode in xawtv or zapping causes system instabilities
- apm driver known to cause frame drops
- ntfs driver known to cause frame drops
Common error messages
Enhanced Real Time Clock support in your kernel can help applications
like tvtime improve the smoothness of their output. The kernel module
that provides this feature is called rtc, and it is compiled and
installed by default on most Linux distributions.
This feature will improve the quality of tvtime's output, but is
NOT required for tvtime to operate.
Reasons why tvtime might not be able to use /dev/rtc include:
- Your user does not have read/write access to the device
file /dev/rtc.
- You compiled your own kernel and forgot to enable the RTC device.
- Your user does not have the rights to get high resolution timers.
If your user does not have sufficient privilege, the RTC device
cannot be used for high-resolution timers. To solve this, you can
do one of the following:
- Run tvtime as root.
- Set the tvtime executable SUID root using this command:
chmod u+s /usr/bin/tvtime.
- Allow user processes the ability to use high resolution timers
by running this command as root every time your machine boots:
sysctl -w dev.rtc.max-user-freq=1024
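The checks above can be sketched as a small shell helper. The default
paths below are the usual Linux ones (/dev/rtc and the sysctl file
under /proc); they may differ on your system, so treat this as a
sketch:

```shell
#!/bin/sh
# Sketch: report whether this user can use the RTC for
# high-resolution timers, per the checklist above.
rtc_usable() {
    dev=${1:-/dev/rtc}
    freq_file=${2:-/proc/sys/dev/rtc/max-user-freq}
    # First reason: no read/write access to the device file.
    if [ ! -r "$dev" ] || [ ! -w "$dev" ]; then
        echo "no read/write access to $dev"
        return 1
    fi
    # Third reason: user frequency limit too low for tvtime.
    if [ -r "$freq_file" ] && [ "$(cat "$freq_file")" -ge 1024 ]; then
        echo "RTC usable for high-resolution timers"
    else
        echo "run: sysctl -w dev.rtc.max-user-freq=1024"
        return 1
    fi
}
```

Remember that tvtime runs fine without the RTC; this only affects the
smoothness of its output.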
tvtime uses the XVIDEO extension, which allocates video card
memory. We have seen cases where the X server fails to provide
enough video card memory for tvtime. This can happen if your video
card is old and simply doesn't have enough memory to hold the
framebuffer and tvtime's video surfaces at once. However, there are
also bugs in X servers and X drivers that can cause this.
If your distribution is using the Xorg X server and you are seeing
these errors, please comment on Xorg bug 474.
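A quick way to confirm that the X server is exposing any XVIDEO
adaptors at all is the standard xvinfo utility. The helper below just
scans its output; the "Adaptor #" string is what xvinfo prints for
each adaptor it finds:

```shell
#!/bin/sh
# Sketch: check xvinfo output for at least one XVideo adaptor.
has_xv_adaptor() {
    if printf '%s\n' "$1" | grep -q 'Adaptor #'; then
        echo "XVideo adaptor present"
    else
        echo "no XVideo adaptors found: check your X driver and video memory"
        return 1
    fi
}

# May legitimately report failure when no X server is running.
has_xv_adaptor "$(xvinfo 2>/dev/null)" || true
```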
Capture card problems
The tuner driver defaults to detecting a PAL tuner on many NTSC
capture cards, a notable example being the ATI TV Wonder cards. This
throws all of the cable frequencies out of alignment: channel numbers
are off by one and slightly detuned. To fix this, you must specify
the tuner type explicitly when loading your capture driver. For
example, with the bttv driver use the following:
modprobe bttv tuner=2
You may have to first remove the bttv module if it is already
loaded. To make this change automatic, in your
/etc/modules.conf file, add the following line:
options bttv tuner=2
This will ensure that whenever the bttv module is loaded, it will
use the correct tuner for this card. If this does not fix the
problem, please post a bug report on the tvtime bugs page.
In older versions of bttv, this module option could also be passed
to the tuner module. This method is deprecated in newer 2.6 kernels,
and so you must pass the parameter to the capture driver itself.
We believe the real fix for this problem is for the tuner module to
detect the tuner type correctly, since so many users are affected.
To help with this, please follow up on tvtime bug 711428.
The popular bttv capture
driver only provided applications with two buffers by default in
versions shipped with kernels before 2.4.21. Our advanced
deinterlacing algorithms require a longer history of past input
frames in order to predict motion in the video stream.
To give applications more buffers, use this option when
loading the bttv driver:
modprobe bttv gbuffers=4
To make this change automatic, add the following in your
/etc/modules.conf file:
options bttv gbuffers=4
On some cards without a tuner, bttv can take a long time to load
(a few minutes). If you see this problem, try using this option
in your modules.conf file for loading the i2c-algo-bit
module:
options i2c-algo-bit bit_test=1
If that does not help, please post a bug report.
The bttv driver seems to misdetect Prolink Pixelview cards pretty
badly. There are five Prolink cards listed in CARDLIST for the bttv
driver:
card=16 - Prolink Pixelview PlayTV (bt878)
card=37 - Prolink PixelView PlayTV pro
card=50 - Prolink PV-BT878P+4E / PixelView PlayTV PAK / Lenco MXTV-9578 CP
card=70 - Prolink Pixelview PV-BT878P+ (Rev.4C)
card=72 - Prolink Pixelview PV-BT878P+9B (PlayTV Pro rev.9B FM+NICAM)
The bttv driver does not seem to autodetect these cards correctly.
We have found that cards which should use card=70 but are
instead loaded with card=72 have audio mute/unmute fail
randomly, or even behave inverted, resulting in no audio or only
snippets of audio. To avoid this, please make sure you are using the
correct variant for your card. Specifically, rev 9D of the newer
Pixelview cards should use card=70 and not card=72.
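If autodetection keeps picking the wrong variant, the card type can
be forced in your /etc/modules.conf file in the same way as the tuner
type. For example, for a card that needs card=70:

```
options bttv card=70
```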
By default, the rivatv driver copies frames from video memory into
system memory, which hurts performance. Performance can be improved
by using DMA to transfer the frames; however, this option only works
when using the open-source "nv" driver in X, not with the NVIDIA
binary drivers.
You can turn on DMA support as follows in your modules.conf file:
options rivatv dma=1
The bttv, saa7134, and cx88 drivers each support a wide variety
of cards built around the same chip. In particular, these cards
differ in which tuner they use, how many inputs they have, and how
they are configured.
Often, these drivers cannot autodetect the card type, or they detect
the wrong card. To debug this, you must watch your kernel's logs
by running the "dmesg" command, potentially loading and
unloading the driver with different options until the driver is
successfully loaded.
Some hints:
- If your card appears as UNKNOWN/GENERIC, then the
tuner driver will not be loaded and the card will likely not
work. You will need to load the driver with the correct card
number.
- If your tuner reports that it is using type -1, it is not loaded
and you will not be able to tune any stations.
- If you are an NTSC user, make sure the tuner you are using announces
itself as an NTSC tuner.
For example, if you are using the bttv driver, the common
procedure for setting up a card is as follows:
- Run "modprobe bttv" with no options.
- Run "dmesg". Check to see if your card is autodetected,
and if the tuner is correct. If everything looks fine, you're done.
- If the card appears as UNKNOWN/GENERIC, find the CARDLIST
file in your kernel documentation and find your card in the list.
- Unload bttv and tuner using "rmmod bttv" and
"rmmod tuner".
- Run "modprobe bttv card=X" where X is the number of
your card.
- Run "dmesg" again. See if the card loaded properly
and if the tuner is correct.
- If not, unload bttv and tuner again, and try specifying the tuner
type as well using "modprobe bttv card=X tuner=Y".
- Curse Linux for being so complicated.
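The "dmesg" checks in the steps above can be sketched as a shell
helper. The failure strings it looks for ("UNKNOWN/GENERIC" and a
tuner "type=-1") are the signatures described in the hints; exact log
wording varies between driver versions, so treat this as a sketch:

```shell
#!/bin/sh
# Sketch: scan kernel log text for the bttv failure signatures
# described in the hints above.
check_bttv_log() {
    log=$1
    # Card not autodetected: driver must be loaded with card=X.
    if printf '%s\n' "$log" | grep -q 'UNKNOWN/GENERIC'; then
        echo "card not detected: reload with modprobe bttv card=X"
        return 1
    fi
    # Tuner not detected: tuner=Y must be given as well.
    if printf '%s\n' "$log" | grep -q 'type=-1'; then
        echo "tuner not detected: reload with modprobe bttv card=X tuner=Y"
        return 1
    fi
    echo "card and tuner look detected"
}

# May report failure outside a real bttv setup.
check_bttv_log "$(dmesg 2>/dev/null)" || true
```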
Video card problems
With the Radeon FireGL drivers, you can choose between having OpenGL
use an overlay surface, giving 3D graphics without tearing, or
providing video overlay surfaces for multimedia players. This is set
in the Device section for the fglrx driver in your
/etc/X11/XF86Config-4 using:
Option "VideoOverlay"
or
Option "OpenGLOverlay"
To use tvtime, you MUST select
VideoOverlay instead of OpenGLOverlay. The
tvtime bug report on this is
bug 787142.
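Putting this together, the Device section might look like the sketch
below. The Identifier string is just an example, and your existing
section will likely carry other options that should be kept:

```
Section "Device"
    Identifier "ATI FireGL"
    Driver     "fglrx"
    Option     "VideoOverlay"
EndSection
```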
We have received multiple reports that the binary XFree86 drivers
from VIA for the EPIA onboard video have corrupted YUY2 overlay
surfaces, and have problems with high framerate video. We have
also heard that these problems do not exist in the open source driver
in XFree86 CVS. More information on either driver and their status
with tvtime would be appreciated.
We have had numerous reports of horrible performance with Radeon
cards under XFree86 4.3. So far, nobody has been able to figure out
conclusively what is going on. See the madness in XFree86 bug 414.
The tvtime bug about this is bug 759804.
The SiS XVideo driver in XFree86 versions up to 4.2.1 exhibits
extremely poor performance. This is fixed in later versions of
the driver. If you own a SiS card, please upgrade to the latest
driver by Thomas Winischhofer, or upgrade to X 4.3, which includes
a recent version of his driver. The web page for the SiS driver
is at http://www.winischhofer.net/linuxsis630.shtml.
The tvtime bug report on this is bug 636338.
Around the end of 2002, it was noted that with the ATI Radeon
7000 card and the XFree86 4.2.1 packages in Debian, XVideo windows
are "always on top" and cannot be minimized. This was also seen in
RH8.0's XFree86 4.2 with a Radeon 32MB DDR, R100 QD (a 7200
card?). We have not researched this issue enough to know the extent of
what cards are affected, or what versions will fix this. More
information on this would be appreciated.
There is a bug in versions of the NVIDIA driver before
1.0-4349 and the nv driver in 4.3.0 and earlier for cards
which support wide filter kernels for high quality XVideo surfaces. If
the overscan value is larger than 0, then the card uses memory outside
of the image for interpolating around the edges, causing corruption all
around the outside of the tvtime window.
NVIDIA was made aware of the problem and fixed it for driver release
1.0-4349 and in the nv driver in XFree86 CVS. For reference, the
tvtime bug on this was bug 694144.
The open source NVIDIA driver for X ("nv" not "nvidia") does not
support XVideo surfaces for pre-GeForce cards like the TNT2.
To use tvtime with these cards, you must use the binary drivers.
The binary drivers give better XVIDEO performance than the
open-source driver, since they can use their kernel module component
to negotiate DMA transfers for video frames. Where possible, stick to
the binary drivers when using tvtime on NVIDIA hardware.
Corrupted frames
The TNT and TNT2 cards are very limited in their bandwidth, and this
problem will appear as horizontal lines in the video output unless
the monitor is at a low resolution. Smaller output window sizes
will make the problem worse, as the DAC will have to read out more data
in the same amount of time. This problem should not occur with any of
the GeForce series cards.
Card cannot downscale
A user of an "NV5: RIVA TNT2 Ultra (rev 11)" noted that the card
cannot downscale video; it can only crop it. This is apparently a
hardware deficiency of many TNT cards, and the XVIDEO API provides
no way to detect this case.
Poor performance at high resolution and refresh rate
A user of a TNT2 Vanta card reported that blit performance
was absolutely terrible until they went down to a resolution of
800x600, at which point speed quadrupled. We believe this is due
to bandwidth problems on these cards.
We've investigated a bug where the savage X driver does not
initialize its XVideo subsystem unless it starts on a modeline where
the resolution equals the virtual resolution. That is, if you have
a virtual resolution of 1600x1200, make sure you start on a
1600x1200 mode, otherwise XVideo won't initialize.
This bug was noted on the XFree86 devel list.
Driver conflicts
Overlay mode, as can be used in xawtv or zapping, causes system
instabilities. This is because their 'overlay' mode allows the capture
card to write directly to video memory without locking it first, which
is unsafe for many video cards, specifically those which do not use a
linear framebuffer (such as NVIDIA cards with the binary drivers).
Use of the xawtv/overlay mode has caused system crashes, and
specifically, can cause tvtime to crash if it tries to use XVideo
features after the framebuffer area has been destroyed by the capture
card (see
tvtime bug 702539).
One way to provide an overlay mode without CPU access is the XFree86
"v4l" extension, which allows safe use of the framebuffer by capture
cards. tvtime does not support this mode, nor do we plan to, as we
would not be able to composite our OSD on top of the video signal
or perform any processing of the video.
We had a problem report that on RH9 (with a later 2.4 kernel), the
GNOME battery applet was polling apm every few seconds. Each poll
blocked the system for 50-100ms, causing stutters in video playback.
We had a problem report that having gkrellm running would cause
frame drops in tvtime. We believe this was due to the ntfs driver,
as the frame drops occurred whenever gkrellm was monitoring the
user's other disk, which had ntfs partitions. See tvtime bug 634674.