Forcing web pages to anti-alias


 
# 1, 08-11-2009

Here is an observation that has started to puzzle me; perhaps someone can enlighten me. When a web page (or a desktop page, for that matter) uses the standard font, it is not anti-aliased unless the user opts in via the desktop settings.
However, on some web pages fonts are still not anti-aliased even when anti-aliasing is switched on. Why? Moreover, when the font colour is changed from black (#000) to something else, say dark grey (#333), anti-aliasing does take effect. Why?

Note: there is a marginal performance cost when anti-aliasing is switched on, but on modern computers it is so small as to be practically immeasurable. That makes the opt-in puzzle me even more, because it might as well be switched on by default; perhaps that is distribution dependent.
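For anyone who wants to force it on globally: on fontconfig-based desktops (typical for Linux distributions), anti-aliasing can be enabled for all fonts with a small per-user config file. A minimal sketch, assuming fontconfig is in use; the file path is the usual per-user location:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- ~/.config/fontconfig/fonts.conf: force anti-aliasing for every font -->
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign">
      <bool>true</bool>
    </edit>
  </match>
</fontconfig>
```

Applications pick this up on restart; no system-wide change is needed.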
 


9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Get web pages and compare

Hello, I'm writing a script to get the content of web pages on different machines and compare them using their md5 hashes. Here is my code: #!/bin/bash # Cluster 1 CLUSTER1_SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054" CLUSTER1_APPLIS="test/version.html test2/version.html... (2 Replies)
Discussion started by: gtam
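The core of that post can be sketched without the cluster loop: fetch the same page from two servers and compare MD5 hashes. The hostnames, ports and page path below are placeholders taken from the post.

```shell
#!/bin/bash
# Sketch: compare one page served by two machines via MD5 hashes.
# Hostnames/ports and the page path are placeholders from the post.
fetch() { curl -s "http://$1/$2"; }
md5_of() { md5sum | awk '{print $1}'; }

a=$(fetch "srv01:7051" "test/version.html" | md5_of)
b=$(fetch "srv02:7052" "test/version.html" | md5_of)

if [ "$a" = "$b" ]; then
    echo "pages match"
else
    echo "pages differ"
fi
```

Hashing the body instead of diffing it keeps the comparison cheap when the pages are large.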

2. Shell Programming and Scripting

Get web pages and compare

Hello, I'm writing a shell script to wget the content of web pages from multiple servers into a variable and compare them; if they match, return 0, otherwise return 2. #!/bin/bash # Cluster 1 CLUSTER1_SERVERS="srv1 srv2 srv3 srv4" CLUSTER1_APPLIS="test/version.html test2.version.jsp" # List of... (4 Replies)
Discussion started by: gtam

3. Shell Programming and Scripting

Checking Web Pages?

Hey guys, Unfortunately, I cannot use wget on our systems... I am looking for another way for a UNIX script to test web pages and let me know if they are up or down for some of our applications. Has anyone seen this before? Thanks, Ryan (2 Replies)
Discussion started by: rwcolb90
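Without wget, curl can do the up/down check; the sketch below treats HTTP 200 as "up". The URLs are placeholders, and --max-time keeps a dead server from hanging the script.

```shell
#!/bin/bash
# Sketch: probe pages with curl and report up/down from the HTTP
# status code. The URL list is a placeholder.
urls="http://example.com/ http://example.com/app/health"
for url in $urls; do
    code=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' "$url")
    if [ "$code" = "200" ]; then
        echo "$url is up"
    else
        echo "$url is down (status $code)"
    fi
done
```

A status of 000 means the connection itself failed, which is usually the "server down" case.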

4. UNIX for Dummies Questions & Answers

curl command with web pages

I can't quite seem to understand what the curl command does with a web address. I tried this: curl O'Reilly Media: Tech Books, Conferences, Courses, News but I just got the first few lines of a web page, and it's nowhere on my machine. Can someone elaborate? (2 Replies)
Discussion started by: Straitsfan
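What curl does by default is print the response body to stdout and save nothing, which explains the "first few lines of a web page" and why nothing appeared on disk. A sketch of the saving options; it fetches a file:// URL so it runs offline, but the same flags apply to http:// URLs:

```shell
#!/bin/bash
# curl prints the fetched body to stdout and saves nothing unless told
# to with -o (pick a name) or -O (use the remote file name). This demo
# uses a file:// URL so it works without a network connection.
tmp=$(mktemp)
printf '<html>hello</html>\n' > "$tmp"

curl -s "file://$tmp"               # body goes to the terminal
curl -s -o saved.html "file://$tmp" # body saved to saved.html
cat saved.html

rm -f "$tmp" saved.html
```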

5. Shell Programming and Scripting

Investigating web pages in awk

hello. I want to make an awk script to search an html file and output all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. I also want the links that will be output to be split into 3 groups (separated by an empty line): the first group with links to other webpages (.html, .htm, etc.),... (1 Reply)
Discussion started by: adpe
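A sketch of the grouping idea: pull href/src targets out with grep/sed, then let awk sort them into three groups separated by blank lines. A regex is a crude HTML parser, so treat this as a starting point only; the sample file is made up.

```shell
#!/bin/bash
# Sketch: list link targets from an HTML file in three groups:
# web pages, images, everything else.
extract_links() {
    grep -oE 'href="[^"]*"|src="[^"]*"' "$1" | sed -E 's/^(href|src)="//; s/"$//'
}

group_links() {
    awk '
        /\.(html|htm)$/         { pages  = pages  $0 "\n"; next }
        /\.(jpg|jpeg|png|gif)$/ { images = images $0 "\n"; next }
                                { other  = other  $0 "\n" }
        END { printf "%s\n%s\n%s", pages, images, other }
    '
}

# demo on a made-up sample file
cat > sample.html <<'EOF'
<a href="index.html">home</a> <img src="logo.jpg"> <a href="paper.pdf">pdf</a>
EOF
extract_links sample.html | group_links
rm -f sample.html
```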

6. UNIX for Dummies Questions & Answers

Browse Web pages through command line

Is there any way to browse web pages while on the command line? I know wget can download pages, but I was wondering if there was an option other than that. (2 Replies)
Discussion started by: vroomlicious

7. UNIX for Dummies Questions & Answers

Selecting information from several web pages...

Hi All! Is this possible? I know of several hundred URLs linking to similar looking hp-ux man pages, like these. In these URLs only the last words separated by / are changing in numbering, so we can generate these... http://docs.hp.com/hpux/onlinedocs/B3921-90010/00/00/31-con.html... (2 Replies)
Discussion started by: Vishnu
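Generating such URL runs is a one-line loop once the changing number is isolated. A sketch using the base URL from the post; the 28-31 range is invented for illustration:

```shell
#!/bin/bash
# Sketch: generate a run of URLs where only the trailing numbered
# component changes, then (optionally) fetch each one.
base="http://docs.hp.com/hpux/onlinedocs/B3921-90010/00/00"
for n in $(seq 28 31); do
    url="${base}/${n}-con.html"
    echo "$url"
    # curl -s "$url" -o "${n}-con.html"   # uncomment to download
done
```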

8. Shell Programming and Scripting

Count links in all of my web pages

Count the number of hyperlinks in all web pages in the current directory and all of its sub-directories, in all files of type "*htm" and "*html". I want the output to look something like this: Total number of web pages: (number) Total number of links: (number) Average number of links... (1 Reply)
Discussion started by: phillip
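A sketch of the counting part: find collects the *.htm/*.html files, grep -o counts one match per `<a ... href>` occurrence, and the average uses integer division for simplicity (use awk if you want decimals).

```shell
#!/bin/bash
# Sketch: count pages and <a href> links under the current directory
# and its sub-directories, then print totals and an average.
pages=$(find . -type f \( -name '*.html' -o -name '*.htm' \) | wc -l)
links=$(find . -type f \( -name '*.html' -o -name '*.htm' \) \
             -exec grep -o '<a [^>]*href' {} + | wc -l)

echo "Total number of web pages: $pages"
echo "Total number of links: $links"
if [ "$pages" -gt 0 ]; then
    echo "Average number of links per page: $((links / pages))"
fi
```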

9. UNIX for Dummies Questions & Answers

Dynamic web pages for Unix Web Server

Hi, my company is considering a new development of our web site, which currently runs on Apache over Solaris. The company who is going to do this for us knows only about developing it in ASP. I guess this means we'll have to have another IIS server on NT for these dynamic pages :( What are... (5 Replies)
Discussion started by: me2unix
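For the record, Apache on Solaris can serve dynamic pages without any IIS/ASP box at all, e.g. via plain CGI (PHP, JSP or mod_perl being the heavier-duty alternatives). A minimal CGI sketch; Apache runs the script from cgi-bin and sends whatever it prints to the browser:

```shell
#!/bin/sh
# Minimal CGI sketch: the HTTP header, a blank line, then the body.
# Anything the script prints becomes the dynamic page.
echo "Content-Type: text/html"
echo ""
echo "<html><body>Generated on $(uname -n) at $(date)</body></html>"
```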
robot-playernav(1)					      General Commands Manual						robot-playernav(1)

NAME
       robot-playernav - GUI client that provides control over localize and planner devices

SYNOPSIS
       robot-playernav [options] <host:port> [<host:port>...]

DESCRIPTION
       robot-playernav is a GUI client that provides control over localize and planner devices. It allows you to set your robots' localization hypotheses by dragging and dropping them in the map. You can set global goals the same way, and see the planned paths and the robots' progress toward the goals. robot-playernav can also display (a subset of) the localization system's current particle set, which may help in debugging localization. You can think of playernav as an Operator Control Unit (OCU). robot-playernav can also be used just to view a map.

OPTIONS
       -fps dumprate
              dump screenshots rate in Hz (default: 5Hz).

       -zoom zoomlevel
              initial level of zoom in the display (default: 1).

       -aa {0|1}
              use anti-aliased canvas for display (0 == false; 1 == true). The anti-aliased canvas looks nicer but may require more processor cycles (default: 1).

       -map map_idx
              the index of the map to be requested and displayed (default: 0).

AUTHOR
       Player was written by Brian Gerkey <gerkey@users.sourceforge.net> and contributors. This manual page was written by Daniel Hess for the Debian Project.

SEE ALSO
       The HTML documentation in /usr/share/doc/player/html of the robot-player-doc package.

Player                                                        May 2009                                              robot-playernav(1)