HanishKVC’s General Blog zone

June 7, 2010

Apple's 2010 New iPhone 4 and the new stories

Filed under: Android,Apple,Blogroll,General,India,life,Nokia,technology — hanishkvc @ 8:25 pm

Tracking Stevey from WWDC 2010 on Engadget. For what else but to see what new stories they come up with this time.

As usual there is the marketing sweet talk from Stevey. Now if you cut all the crap and look at reality, the buckets are:

**** Good (not necessarily first, but still good to see)
* using frame as part of antenna system
* not blindly upping the megapixels but doing the sensible thing, i.e. keeping the megapixel count in check and trying to improve quality (not sure how many remember Palm's VGA camera in their mobile, which was a similar strategy from them eons back, i.e. quality over blind quantity)

**** the gas (nothing new, but as usual nice stories/spin from Steve)
* retina display – natural progression from the already existing 800*480 phones from other vendors
* A4 – good Arm core like any other current gen Arm core
* Multitasking – finally some sense went to appy, but still some more work left
* ibook – ebook reader, with fewer supported formats than others out there. And potentially locked. (My prayer: Oh God, don't let Apple fanboys kill the EPaper based ebook readers with the stupid buzz that Apple and some of their blind zombies generate)
* video calling and 2nd camera – even in India, where the iphone comes 1 year late, almost locked and at a premium, NOKIA had support for standards-based video calling even before 2007 (i.e. even before the apple iphone was born)
* iads – ads have been there for ages, and even interactive ads have been around for a long time in other CE device domains, and in mobiles for some time now
* thinner phone – natural progression from all companies. And don’t forget no physical keyboardy.
* video editing – as usual new in apple universe, but pretty old in other vendor mobiles
* Gyro – natural progression, already used in other device domains

Conclusion: nothing new or great here, but still have to give it to them for taking the less traveled path wrt antenna design and camera.

Rather than the stupid marketing spin of the Retina display, they could have been technically path-breaking if they had actually brought something like the PixelQi LCD+EPaper display into their product.

Have to give it to Google: as it stands now, they are in a better position with their Android platform in the mobile phone market. And Nokia, please wake up and come out with some good high-end phones, including Linux based ones (MeeGo, or Android (why not), now that you have killed Maemo).

(Tried on Google Buzz for Appy Stevey's 4th iPhony; updated a bit for here)


February 28, 2010

Android AOSP for G1 ADP1 HTC Dream

Filed under: Android,Blogroll,linux,OpenSource,technology — hanishkvc @ 10:56 pm

Android AOSP for G1 / ADP1 / HTC_Dream
v01March2010, HanishKVC, Feb2010
===========================

This document gives the steps required to build Android AOSP for the ADP1 / HTC Dream / G1 phone.

>> Current status: Basic kernel (boot), wifi and the system image are working.
Not sure of 3D and Calendar/Contacts (with Google sync) [GoogleLoginService?] <<

Building Android Donut release using
the Ubuntu Karmic (9.10) i386 as the development / host machine
————————————————————————————————

***N*** Preparing the system

We need to get the Java 5 JDK, which is required by Android, installed on Ubuntu 9.10. But by
default 9.10 comes with Java 6, so we pick up Java 5 from the 9.04 (jaunty) repositories. To do this we
add the lines below to /etc/apt/sources.list as the root user (or by using the Software Sources GUI).

We also need to set up a udev rule to help with debugging of Android devices. The example
below is for HTC devices; for other manufacturers, replace with the appropriate vendor id.

$ sudo bash
# echo "deb http://mirrors.us.kernel.org/ubuntu/ jaunty multiverse" >> /etc/apt/sources.list
# echo "deb http://mirrors.us.kernel.org/ubuntu/ jaunty-updates multiverse" >> /etc/apt/sources.list

# echo 'SUBSYSTEM=="usb", SYSFS{idVendor}=="0bb4", MODE="0666"' >> /etc/udev/rules.d/51-android.rules

# exit
$ sudo apt-get update

Next we need to get the required build and support applications installed:

$ sudo apt-get install git-core gnupg sun-java5-jdk flex bison gperf libsdl-dev libesd0-dev libwxgtk2.6-dev build-essential zip curl libncurses5-dev zlib1g-dev

***N*** Getting the repo tool and the code, and setting up

We need to install the repo program, which Google uses to manage their code repositories and workflow.

$ mkdir ~/bin
$ export PATH=~/bin:$PATH

$ curl http://android.git.kernel.org/repo >~/bin/repo
$ chmod a+x ~/bin/repo
$ mkdir ~/work/aosp_donut
$ cd ~/work/aosp_donut

$ repo init -u git://android.git.kernel.org/platform/manifest.git -b donut

NOTE: If you want to work with the master branch, then don't give the -b option to repo init (see below).
NOTE: However I haven't had success with the master build as it stands on 22Feb2010; have to check why.
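
That is, for the master branch the init would simply be:

$ repo init -u git://android.git.kernel.org/platform/manifest.git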

>>ALTERNATIVE FOR Cyanogen's repo, haven't tried yet<<
$ repo init -u git://github.com/cyanogen/android.git -b donut

$ repo sync

***N*** Getting and preparing the proprietary stuff from HTC

FILE1: htc-adp1.sfx.tgz

This file is available as "HTC Proprietary Binaries for ADP1" from
http://developer.htc.com/

Download this file and copy it into ~/work/aosp_donut/vendor/htc/dream-open

FILE2: signed-dream_devphone_userdebug-ota-14721.zip

This file is available from
http://developer.htc.com/adp.html

Download this file to the root of your android repository, i.e. ~/work/aosp_donut
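
Assuming both files were saved to ~/Downloads (a hypothetical download location), copying them into place might look like:

$ cp ~/Downloads/htc-adp1.sfx.tgz ~/work/aosp_donut/vendor/htc/dream-open/
$ cp ~/Downloads/signed-dream_devphone_userdebug-ota-14721.zip ~/work/aosp_donut/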

$ cd ~/work/aosp_donut/vendor/htc/dream-open
$ tar -zxvf htc-adp1.sfx.tgz
$ ./htc-adp1.sfx

Note: Not sure if htc-adp1.sfx is required, because looking at unzip-files.sh, it seems like it should work even without this??? Have to check

$ ./unzip-files.sh

***N*** Fixing some bugs in the code (rather, stricter-compiler related issues)

ISSUE 1:
development/emulator/qtools/trace_reader.cpp:1012: error: invalid conversion from ‘const char*’ to ‘char*’
development/emulator/qtools/trace_reader.cpp:1015: error: invalid conversion from ‘const char*’ to ‘char*’
SOLUTION: Replace the char* defs with const char* definitions.

ISSUE 2:
development/emulator/qtools/dmtrace.cpp:166: error: invalid conversion from ‘const char*’ to ‘char*’
development/emulator/qtools/dmtrace.cpp:183: error: invalid conversion from ‘const char*’ to ‘char*’
SOLUTION: Type cast the assignments with (char *)

NOTE: donut_plus_aosp – seems to fix these bugs, but has other large changes also; haven't tried it yet.
NOTE: Also not sure wrt the Video/Audio codec h/w accel optimized modules in donut_plus_aosp

***N*** Building the code

Now that the source code is available and set up with the proprietary binary stuff from HTC, let us start the actual build:

$ cd ~/work/aosp_donut

$ source build/envsetup.sh
$ lunch aosp_dream_us-eng

$ make -j4

Now the generated files (boot.img, recovery.img, system.img, userdata.img) will be available at out/target/product/dream-open/

***N*** Check out this source-compiled user space (system.img) with the prebuilt kernel based boot.img

$ cd ~/work/aosp_donut/out/target/product/dream-open
$ rm recovery.img <=> In case you want to replace it with your favorite recovery image
$ cp /path/to/recovery_cyanogenmod_amon_ra.img recovery.img   <=>  This replaces the default recovery image with one which you want

Put the device into FASTBOOT mode (reboot/power_on the device with the BACK key pressed; it should show the droids on skateboards with the fastboot text displayed)

$ fastboot devices
$ fastboot erase userdata   <=>   similarly boot and cache
$ fastboot -p dream-open -w flashall
$ fastboot flash userdata userdata.img
$ fastboot reboot

***N*** Compiling android kernel source for Dream/Adp1/G1

*** Get the kernel source
$ cd ~/work/kernel

$ git clone git://android.git.kernel.org/kernel/msm.git
$ cd msm
$ git branch -r
$ git checkout -b android-msm-2.6.29 origin/android-msm-2.6.29

*** Set up the path to point to the appropriate compiler tool chain

$ export PATH=$PATH:~/work/aosp_donut/prebuilt/linux-x86/toolchain/arm-eabi-4.4.0/bin

*** config the kernel

OPTION 1: Get the default config options for the kernel from what is already specified in kernel source wrt adp1

$ make msm_defconfig   <=> Rather use the next command to be safe
$ make ARCH=arm CROSS_COMPILE=arm-eabi- msm_defconfig

OR
OPTION 2: Get the config from a phone running android

$ adb pull /proc/config.gz .
$ gunzip config.gz
$ mv config .config

*** Build the kernel

$ make ARCH=arm CROSS_COMPILE=arm-eabi-

The kernel will be in arch/arm/boot/

*** Move the kernel to Android platform source directory

$ cp ~/work/kernel/msm/arch/arm/boot/zImage ~/work/aosp_donut/vendor/htc/dream-open/kernel

Now building the android platform will use this new kernel to build the boot.img and recovery.img
NOTE: Defer building the android platform if you also want to update the wifi module (wlan.ko)

*** Build the wifi module to match the new kernel and copy it to appropriate platform directory

$ cd ~/work/aosp_donut/system/wlan/ti/sta_dk_4_0_4_32
$ export KERNEL_DIR=~/work/kernel/msm

$ make  <=> OR the next line
$ make KERNEL_DIR=~/work/kernel/msm ARCH=arm CROSS_COMPILE=arm-eabi-

$ cp ~/work/aosp_donut/system/wlan/ti/sta_dk_4_0_4_32/wlan.ko ~/work/aosp_donut/vendor/htc/dream-open/

NOTE: Looking at the comment in the wlan/ti/ … directory, there seems to be another way
of getting this to autocompile, but at this time I am not sure how that will work out

Now building the android platform will use this new wlan.ko to build the system.img (/system/lib/modules/wlan.ko)

*** Building the Android platform with new kernel and wifi module (already copied to the required locations)

$ cd ~/work/aosp_donut
$ source build/envsetup.sh
$ lunch   <=> remember to select appropriate target
$ make -j2

***N*** Burning the new images

$ export PATH=~/work/aosp_donut/out/host/linux-x86/bin:$PATH

Boot the device into fastboot mode by powering on the device with the BACK button held pressed.
You should see the droids on skateboards and also the fastboot text on the screen.
Connect the usb cable between the device and the host, if not already done so.

* On the host pc do (IF ONLY UPDATING boot.img)

$ fastboot devices  <=> This should list your device
$ fastboot erase boot
$ fastboot flash boot boot.img
$ fastboot reboot

* On the host pc do (IF burning everything, i.e. all imgs (boot, recovery, system, userdata))

$ fastboot devices
$ fastboot erase boot
$ fastboot erase userdata <=> This is just in case for future
$ fastboot erase cache <=> This is just in case for future
$ fastboot -p dream-open -w flashall
$ fastboot flash userdata userdata.img
$ fastboot reboot

* Now the phone should have rebooted into Android with the new kernel/system which we just burnt. Check it out by looking into Settings->About phone

$ adb devices  == This should list your device
$ adb shell    == Now you have root access to your device

***N*** Burning into G1 (using recovery image)

Copy the boot.img and system.img to the root of the sdcard in the G1 (one possible way is sketched below).
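
For example, assuming the phone is booted normally with USB debugging enabled (or is in a recovery that exposes adb and mounts /sdcard), something like this should do:

$ cd ~/work/aosp_donut/out/target/product/dream-open
$ adb push boot.img /sdcard/
$ adb push system.img /sdcard/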
Boot your G1 into recovery mode by powering on the G1 with the HOME button pressed.
Then get into the console and run:

$ mount /sdcard
$ flash_image boot /sdcard/boot.img
$ flash_image system /sdcard/system.img

NOTE: This didn't work. Maybe because I didn't erase userdata and load new userdata??? Have to check this later.

Repo queries
————————-

** I don't see a direct way of finding which branch is set up by repo init, other than by looking at the .repo/manifest.xml file. Am I missing something, or is that how one looks?
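
For what it's worth, a quick peek (assuming the default repo layout, where .repo/manifest.xml reflects the selected manifest) is just to grep for the tracked revision:

$ grep revision .repo/manifest.xml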

** I see 2 different manifest files for donut, and am not able to find a simple way of telling which of the donut manifests is used by default, or how to change it if required. Or am I interpreting something wrongly here?

** Also there is no direct info as to what release tag/branch corresponds to what at a high level (like the difference between donut and donut_plus_aosp – I also had to look at .repo/manifests.git/FETCH_HEAD to find the branches available)
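
A possibly less painful way to list the manifest branches that repo has fetched, assuming it keeps the manifest git data under .repo/manifests.git:

$ git --git-dir=.repo/manifests.git branch -a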

** Also the google repo usage document assumes that the developer already understands git/repo quite well (i.e. a novice will need to dig a lot more). They could help by making the document a bit more verbose and, more importantly, by adding some use cases.

Useful links
———————————–

http://source.android.com/download
http://source.android.com/documentation/building-for-dream
http://wiki.cyanogenmod.com/index.php/Building_from_source

And so many other websites, e.g. kaushikdutta and:
http://ctso.me/2010/01/building-an-android-rom-part-1/

Experiment status till date
—————————————————-
>>Attempt 1: FAILED<< when trying with the Google android.git.kernel.org repository
a1.1 – Used the master branch instead of donut
a1.2 – Did not use htc-adp1.sfx.tgz (because unzip-files.sh seems to work around this)
a1.3 – Used the prebuilt kernel which is already part of the repository, and which in turn is automatically used to create boot.img
a1.4 – Used flash_image boot boot.img and flash_image system system.img from cm's recovery console
a1.5 – Targeted aosp_dream_us-eng

>>Attempt 2: Basic kernel (compiled locally), wifi (locally compiled) and system apps running<< Tried with the donut branch this time.
a2.1 – Used donut branch

>>Attempt 3: Have to try donut_plus_aosp<<

>>Attempt 4: ToDo<< Later I have to try with cyanogen's repository.
While searching for steps to get aosp compiled for the G1, I have also come across a few pages by ctso (linked above).
As I have some issues with the Cyanogenmod build, I want to stick with the Google aosp code for now, so I haven't given Cyanogen's repository a try yet.


December 6, 2009

The Bad and the Good, ARM Netbook will be WORSE wrt Open source

Filed under: General,OpenSource,technology — hanishkvc @ 5:14 pm
A debate has been going on in the Linux community wrt Intel, the GMA500 (Poulsbo, PowerVR) and the pathetic support for the same. Some felt (or may feel) that the potential ARM based netbooks could solve their problem. Now, the truth couldn't be more contrasting than this, so I posted a response on a related and good article on Linux Journal requesting Intel to come to their senses. I am publishing my response to "that article and the comments it generated" here.
**** My comment FROM Linux Journal site ****
On December 6th, 2009 hanishkvc (not verified) says:
Hi All,
To summarise things as they stand today wrt the Netbook/UMPC/MID market, TRUE open source, and Intel vs ARM:
a) Processor
—————-
* Arm is good at power consumption but a bit low on performance
* Intel is good at performance but a bit low on power compared to Arm levels
But given the tradeoffs, that is to be expected. Intel Atom is slowly moving towards Arm territory with respect to power, AND Arm is slowly moving towards Intel territory wrt performance with the Cortex-A cores, relative to mobile platforms like netbooks.
BUT coming to FREE AVAILABILITY of DOCUMENTATION of their chips and the associated peripherals, Intel is slightly better than ARM Inc (where is the up to date ARMARM document, ARM Inc, for the latest Arm versions???). SO, TRUE OpenSource people, PLEASE DON'T ASSUME that Arm is better and that one should switch from Intel to Arm (as a person mentioned above); you can't be more wrong wrt what we ultimately want.
b) Core Chipset (All IO in General)
————————————
Keeping power consumption in mind, today Arm based SOCs are better integrated and more flexible wrt all the io modules, be it ram, graphics, video, 3d, audio, expansion (serial/parallel buses), core support logic (timers, interrupt controllers), storage cards AND finally additional integrated coprocessors.
Intel was stuck with bad companion chipset combinations for such low power products as the Netbook (i.e. provided you want the Netbook to be full-day computing), so they had to come out with a low power companion chipset for Atom (be it the N series or Z series), AND the FIRST step in that direction from Intel is what the US15 (Poulsbo) is, which in turn uses the GMA500 (PowerVR). They still have a bit more distance to travel to match ARM SOCs here, but at least it is a start.
Hopefully either
* they will be able to get support for the GMA500 added to mainstream linux through either a regularly updated binary driver, or better an open source driver, or best case by opening up the documentation for it,
* OR they will replace the PowerVR 3D+Video core with their own in a future generation of the Atom companion chipsets or the integrated CPU+GPU soc (the current one is still PowerVR).
DON'T FORGET THAT the majority of the ARM SOCs out there use the PowerVR 3D + proprietary (dependent on who makes the SOC) video accelerators. So an ARM based netbook will be AS BAD OR WORSE THAN the current Intel netbook platform AS FAR AS TRUE OPEN SOURCE people are concerned.
OR, LUCKY for all, Imagination opens up the documentation or releases an open source driver for the 2D, 3D and video parts of their logic.
**** END of my comment from Linux Journal site ****
Update: Rather, I forgot one more important related point while posting at LJ. It is this: if we go further and look at free documentation for the Arm based SOCs, then things turn out to be much worse in general (there are exceptions – certain SOC vendors have provided good documentation for the basic SOC part, but even they leave the powerful features of their SOC out of the freely available documentation), so we still have some distance to go before we can say open source and ARM based netbooks in the same breath.

May 17, 2007

BeYOND stupidity SONY, do LOOK = Come to your senses, allow Indian PS3 Owners access to Playstation Store +++, Link India PS site from global

Filed under: gaming,General,India,life,protest,PS3,Sony,technology — hanishkvc @ 8:39 pm

Hi Sony / Sony India if you are listening.

  1. At first you charge us a lot for your PS3 (among the highest rates in the world for the PS3) [I mind (50-50); luckily I got it when coming from the US],
  2. next you give us a very small set of games for the PS3 [I mind it very much, it's bad],
  3. on top of that you don't allow us access to the Playstation Store [I mind it very very much, it's daytime robbery],

WHAT LOGIC IS THIS?

Please look BeYOND your foolish/senseless/stupid attitude and logic and correct these mistakes. At least if you can't fix <1> (cost of PS3) above, then, given that Indians are paying you a lot more compared to many other regions, at least fix <2> (game library) and <3> (PSN store++) above.

Similarly in future once Playstation HOME debuts don’t repeat the same mistakes. Let Indians participate in it from Day one.

I am also a Sony PSP owner in India, and even Sony's own stores don't have a good library of games for it either.

Last but not the least, why DON'T we have a link to the India PS3/PS2/PSP site from playstation.com / asia.playstation.com?

Come on wake up and be a good CE company and be good to your customers.

Note: I am a really frustrated Sony customer in India with a Sony camera, PSP and PS3. So when the time came to buy an HDTV for my PS3, I went with Samsung. Hint hint …..

Note: The only reason I went with the PS3 instead of the XBox360, in spite of all your stupidities, was your support for Linux and Cell; but that doesn't mean you can take me for a_ride/granted for ever.
Note: I am posting it in some not directly related categories also, so that hopefully I get the message across to Sony. Sorry for that.

May 16, 2007

Short and simple commandline Bluetooth in any new Linux distros

Filed under: bluetooth,debian,linux,Nokia,OpenSource,technology — hanishkvc @ 7:22 pm

Yesterday I had to transfer some files / S60 opensource programs to my Nokia 6630 mobile, so I picked up my usb bluetooth dongle (after ages) and connected it to my Linux PC to achieve the same. I had forgotten the things I had done a long time back to get it working (also, one of these days I have to find out where I noted those steps down).

Either way, I started by remembering that I have to try and use obex logic to put those files on the mobile (now come on, remembering that isn't that difficult ;-). Soon I remembered most of the things to do through aptitude search/show on the bluetooth packages, dpkg -L <bluetooth related packages>, some trial_N_error and net searching (googling).

But to my horror, whatever I did the connection wouldn't establish, as the bluetooth stack on the PC wasn't picking up the PIN which I had just configured on it. After some more rtfm, dpkg -L bluez-utils and cross verification on the bluez website, I realised that the way the PIN is specified to the bluetooth stack on the PC has changed: instead of the old pin helper, it now uses a dbus based passkey handler. So I compiled the provided passkey-agent.c and resolved it. And thus I could achieve the file transfer without going into Windows, though with some deficit of sleep 😉
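
For the record, compiling that sample agent was roughly as below (a sketch from memory; the exact location of passkey-agent.c within the bluez-utils docs/examples and the dbus dev packages needed may differ on your distro):

$ gcc -o passkey-agent passkey-agent.c $(pkg-config --cflags --libs dbus-glib-1)
– passkey-agent.c ships as an example with bluez-utils (look under /usr/share/doc/bluez-utils/)
– it needs the dbus-glib development headers installed (e.g. libdbus-glib-1-dev on debian-ish systems)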

So here are the commands one could use to work with bluetooth devices in a linux based pc =>

hciconfig
– Gives info about the bluetooth hci on your pc
– Ensure the device is up and running and has required scan modes
– hcitool dev should also give some of this info

hcitool inq and hcitool scan
– Gives info about or rather identifies nearby bluetooth devices

hcitool info <BTAddr>
– Get info about remote bluetooth device

l2ping <BTAddr>
– One way to see if we can communicate with a remote bluetooth device

sdptool browse <BTAddr> or sdptool records <BTAddr>
– Gives info about the services provided by a remote bluetooth device

obexftp --nopath --noconn --uuid none --bluetooth <BTAddr> --channel <OPUSHChannelNo> --put <FileToPut>
– Allows one to send file without specifying the pin on the remote device side
– The OPush channel number for device is got from sdptool above

passkey-agent --default <Pin>
– The Pin specified here is what the remote BT device should provide,
or what its user should enter on that device when requested.

obexftp -b <BTAddr> -v -p <FileToPut>
– Allows one to put a file onto the specified BT device
– obexftp could also be used to get or list the files on the BT device
– also allows one to identify a nearby BT device by just giving -b option

obexpushd
– Allows one to receive files sent from a bluetooth device.
– Depending on who started it, the received files will be stored in the corresponding home directory
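
Putting it together, a typical push of a file to a phone might look like the following sketch (the BT address, channel number and file name are made-up placeholders; the OPUSH service usually shows up in sdptool output as "OBEX Object Push"):

$ hcitool scan
– note the address of the phone, say 00:11:22:33:44:55
$ sdptool browse 00:11:22:33:44:55 | grep -A 10 "OBEX Object Push"
– note the OPUSH channel number from the protocol descriptor, say 9
$ obexftp --nopath --noconn --uuid none --bluetooth 00:11:22:33:44:55 --channel 9 --put myprogram.sis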

Note: The old style pin helper doesn't work with the latest bluez; you require a
dbus based passkey handler, and there is one provided by default by bluez-utils
called passkey-agent.
Hope this helps anyone who is trying to use bluetooth devices from the commandline on a new linux distro, as well as helping me remember for my own future use.

March 11, 2007

Finally Got my PS3, runs in india and runs linux but no mp2 and no timed poweroff. Also allow Indians to register on PSN

Filed under: gaming,General,India,life,OpenSource,PS3,Sony,technology — hanishkvc @ 5:45 pm

As I had thought sometime back, I figured I might buy a PS3 when it comes out in India, both for its gaming features as well as its media center kind of capabilities (especially for my family, i.e. easy to use) and, more importantly for me, its computing capabilities.

Lucky me that I had to go to the US for a short trip last month, and in the process, while coming back, picked up a PS3 premium edition from Atlanta. (Just for Sony to know, I found 3 to 4 PS3s available at an EBgames shop, with the shop guy telling me they had been sitting there for some time and were not moving; so can I look at getting around $3K or so, wasn't that what the Sony US president had said he would give people who find a PS3 in stock for a long time?) Thus I am saved from having to pay an additional premium in India when it releases (whenever that is; still no info from Sony about it, which is bad). Also, even though Sony doesn't mention it, IT DOES HAVE A UNIVERSAL POWER SUPPLY (AT LEAST the JAN2007 model which I brought); I took a (calculated) chance and it did run without requiring any stepdown transformers.

I will be able to experiment with its PSP interfacing capabilities, rather once I find time to upgrade my PSP to 3.x firmware, obviously the non-native way (i.e. not a direct upgrade, because I love homebrew).

I tried installing Linux (the YDL version) and it does work. I have a problem with the mouse (rather, a ps2 mouse through a USB to PS2 converter device) which I have to fix later; otherwise it does run fine. Sometime later I would rather come out with a fully ramdisk (initrd) based small linux distro for it, with some key utilities which I use. With it using kboot as its otheros bootloader, that shouldn't be too difficult. Also, as I already have a native PowerPC compiler running on the PS3, I don't have to mess with cross compilers either (not that I don't like them or any such thing, just lazy these days).

If anyone is interested, the kboot customisation for the PS3, or rather the full ps3 linux devkit cd, is available from kernel.org (as well as from dl.qj.net, but qj.net had download issues, as it is not a simple wget for qj.net links, and power was failing every time I was trying to download here, making it hard for me to continue a partial download).

I tried Resistance and RR7, and I liked both overall. However I did find that the PS3 doesn't support mp2 (not layer3 (i.e. mp3) but layer2) decoding, while a $50 (Rs2000) dvd player supports it. Also, one more thing I wanted was a timed poweroff feature, so that I could let it play some songs or so and after a given time it powers itself off; I didn't find this feature in the PS3. It would be useful if Sony could add these features to the PS3.

Also, if Sony could release direct dev tools for the PS3, i.e. not for the Linux environment but for their native PS3 os, like MS does, then it would be even more fun. Either way I do appreciate the Sony PS3 allowing linux to be installed, i.e. one of the reasons I bought the PS3.

SONY, ALSO please do allow us Indians to register on your playstation network with our Indian address if you can. I legally bought a product from the US on my own (and also bought 2 game cds along with it, which I think is better than what most US and Japan customers have done on average till now), but I will be using it in India; can't I register it with my Indian address (which you don't allow for now on your website)?

November 29, 2006

Running Mupen64 in Fedora Core 6

Filed under: emulation,fedora,gaming,n64,nintendo,OpenSource,technology — hanishkvc @ 4:34 pm

Being an emulation/simulation fan, I recently moved my interest from nes and snes (zsnes and snes9x) to n64 (nintendo 64). On searching the net I found that the only relatively active open source project currently for n64 emulation is mupen64, released sometime last year.

I downloaded the source and binary versions. Initially trying it on FC6, I found issues with both the mupen64 which I compiled and, later, also with the binary version directly downloaded. After some breaking of the head and some trial and error, this is what I had to do to get Mupen64 running on FC6 using an ATI 3D acceleration driver (i.e. ATI's driver and not the one from Xorg). Also, what I noticed is that Mupen64 works BETTER | PROPERLY once I use ATI's driver instead of Xorg's driver.

  1. **Fixing the SELinux issue with plugin**
      chcon -t textrel_shlib_t /pathto/mupen64-0.5/plugins/*so
  2. **Fixing the AIGLX and Composite issue with the ATI driver (as I use ATI drivers)**
    Current ATI accelerator drivers don't support AIGLX/Composite along with DRI. So one is
    required to disable the AIGLX and Composite features in the X server if one wants to get OpenGL
    acceleration in the Mupen64 graphics plugins. This is _essential_ if one wants good speed
    during emulation.
    #** Put the following into your /etc/X11/xorg.conf **
    Section "ServerFlags"
        Option "AIGLX" "off"
    EndSection
    Section "DRI"
        Group 0
        Mode 0666
    EndSection
    Section "Extensions"
        Option "Composite" "off"
    EndSection
  3. ** Make Mupen64 use ATI's (your 3D h/w based) libGL instead of MesaGL's **
    The ati h/w based libGL doesn't install into /usr/lib/, but rather installs into /usr/lib/ati-fglrx.
    This creates a problem because MesaGL's libGL is under /usr/lib and by default any
    program will pick this up instead of your h/w accelerated libGL. I THINK FC6 (the livna guys)
    should fix this at the package level. However, on trying to fix it by forcing mupen64 to use
    the proper libGL using the /etc/ld.so.conf.d/prgname logic, it failed. I didn't try breaking my
    head as it was already 3 or 4 am in the morning and I still had games to try. So I worked
    around this by creating a symbolic link to the hw based libGL in the mupen64 directory and
    using LD_LIBRARY_PATH to force the use of the proper libGL, as shown below.

    • cd /pathto/mupen64
    • ln -s /usr/lib/ati-fglrx/libGL.so.1.2 libGL.so.1
    • export LD_LIBRARY_PATH=.
    • ./mupen64 (NOTE: Now you should be happily running mupen64 in FC6 with acceleration)

If you have taken care of the 3 issues mentioned above, you should now be able to happily enjoy running Mupen64 on FC6 with 3D acceleration for graphics. Enjoy.

October 20, 2006

My NEXT PC might be a PS3

Filed under: General,life,OpenSource,PS3,technology — hanishkvc @ 4:06 pm

For some time now I have been thinking of upgrading my computer (in terms of CPU and GPU) and/or buying a laptop. However, because of various reasons, I haven't been able to do either.

Pricey PCs (well, in a way):

Well, wrt CPUs, now that the Core 2 Duo is out the technology-advance issue is resolved; however, prices being a bit high here in India, I am waiting for them to drop. Now coming to GPUs: with the availability of unified shader logic (rather, more generically, the flexible pipeline and flexible programmable nature) in the newly released and to-be-released GPUs, the technology advance I have been waiting for is finally resolved, but then again the pricey nature of GPUs (rather more severe in the case of GPUs compared to CPUs in India) makes it hard to buy them.

Consoles to the rescue:

So, with assembling a PC hit by the pricey nature of new CPUs and GPUs, the other alternative I can look at is buying a console (I mean a game console).

The price issue becomes less of an issue, as the vendors normally try to sell the console at a lower cost, or at least a lower margin of profit from a h/w perspective, and then try to make money on the games that are sold.

Coming to technology: well, at a technical level a game console is also a computing device with similar capabilities to a PC. However, over the last few generations of the game console, the vendors have been positioning them as pure gaming devices and not letting users utilize the full power hidden in these products. But things are changing.

XBox360 is a good console with a 3-core PowerPC asic (with each core supporting 2 threads) and a unified shader based gpu from ati. It has support for storage (HDD), networking (ethernet and wlan) and external expansion (USB, Bluetooth). And it has sufficient memory (512MB). However, the problem here is that Microsoft (i.e. the developer/vendor) doesn't want the XBox to be used for anything other than gaming in principle. There is some hope to circumvent that legally to some extent by using the XNA Express developer framework; however this can only be at the application level, and here too only in terms of managed code.

PS3 from Sony on the other hand has the Cell processor (a PowerPC core + 8 specialised processing elements (mini cpus)) (having architected embedded products which use multicore ASICs involving a general purpose CPU and/or DSP and specialised processing elements from other chip vendors, I am pretty happy and looking forward to all the possibilities in these specialized ASICs with seemingly limited resources) and a GPU from NVidia (hoping against hope that it will have a surprise in terms of unified/flexible shaders; even otherwise, to some extent it is still ok). It also has storage (HDD, flash cards), networking (WLan (and ethernet???)), expansion (USB, Bluetooth, flash card interfaces (SD/…)) and memory (512MB). And TO TOP IT UP, Sony is WILLING to let users UTILIZE the computing power of their console for whatever the user fancies. And in turn they will DO IT IN STYLE by USING/EXTENDING OPEN TECHNOLOGIES like Linux, OpenGL, OpenXYZ, GCC, and open source applications (belonging to many/any domain).

So the PS3 presents itself as a good gaming console as well as a good general/special (thanks to Cell) computing device. Even though it might be pricey in India at the beginning (only sometime around mid-2007, if I may fancy a guess), I would still consider it a better pricey thing to buy than a pricey PC (which in turn I would have to keep upgrading at least every ~1.5 years if I want to play the latest and greatest games in their full glory).

One more reason for tilting towards a game console (the PS3 in this case) is that games will be specifically optimized to utilize the full power of the console to give the best results. Also, games will be available for the console for at least a 4-5 year period without requiring an upgrade (rather, a change) of the console, which is not the case with PCs (i.e. if you want to fully see what the game developer wants you to see).

Consoles which I might use for a similar purpose, but am not fully happy with:

Nintendo Wii: Again uses a simple PowerPC core and a simple GPU. Other than only having a simple CPU and GPU, it has a proprietary DVD format if I am not wrong. Again, native availability of a flexible computing environment, with the option to add one's own modules/applications, is questionable at this stage. One could always hack these to get those capabilities, and there will surely be communities on the internet to help with that, but then it is a different story altogether.

Last generation – PS2 and XBox: At this point in time the computing power/capabilities of these products fall below what an average user/developer might expect. Again, among these the PS2 would be the one I would pick if I had to, because (a) it has official support to experiment with linux and (b) it is a non-standard platform design (which is what I like and would love to experiment with) compared to the Xbox (a PC at a basic level).

October 4, 2006

My bit old but true thoughts on WinVista – Nothing radical compared to other OSs around, just SP3plusplus stuff

Filed under: Microsoft,technology — hanishkvc @ 1:38 pm

Sometime back in June2006, there was a blog post from the WinFS team trying to say that WinFS is not dead but just transformed into a better and richer form by moving much of its logic into SQLServer and some other support libraries. Now, even though WinFS would have used some form of database and intelligence above it to achieve what it aimed to achieve, the lack of that intelligence in WinVista can in no way be justified as good, and many people did rightly point that out to MS. I had posted my feedback to the WinFS team then, asking them not to abandon the concept of WinFS. Here it follows:

Hi WinFS team,

I use and experiment on many OSs. Even though I have been using Linux as my main OS for a long time, because of my interests I also have Windows and do keep track of it. I have tried WinVista; agreed, it is fancy looking (but a bit slow, though at the same time it's still beta, so I hope the speed increases by the final release), but currently I don't see it adding much more value than a better access check mechanism for operations and an updated graphics driver model, which I believe is more XP SP3 material rather than the radical change that WinFS would have been.

So I hope you people keep WinFS going rather than abandoning it.

Later I had one more post clarifying some of the thoughts above; it follows:
Just to clarify what I meant above when I said keep WinFS going. Agreed, the possibilities around various applications of WinFS could be got by having a flexible attribute mechanism at the filesystem level and intelligent indexing and/or storing of all data (content data as well as all metadata).

At one level it requires the WinFS core services to be available at one end, and at the other level significant effort needs to be put into the applications so that they are woven around this core. And that system wide effort is what can give the power of this to all.

So I hope Microsoft moves towards this system wide coherence (implicitly or explicitly achieved) in simplifying and powering up things for the end user, which most OS environments will move towards slowly.

Open sourcing code (which should be ethically open) is equally or more important than hiring a few open source developers or supporting opensource

Filed under: life,OpenSource,technology — hanishkvc @ 1:15 pm

In Aug2006 there was a blog entry about Google, the godfather of opensource(?), in Linux Journal, which tried to justify Google not opensourcing their code by saying that they hire people who work on opensource products, that people won't understand Google's code contributions so why should they, and so on. It really made me sad that people think in such ways and make bad things appear good. So I commented there with my thoughts on why it's not good. I am adding it here as part of the consolidation of my thoughts. Here follow my comments from then:

Hi,

It is good to know that google is hiring open source developers so that they can concentrate on their open source work rather than worrying about how to earn a living in parallel to working on open source projects.

However, if Google is not open sourcing some of its code which it should have open sourced from a purely ethical point of view (but is not doing it by hiding behind some shortcomings in the existing GPL or other open source licenses – I feel/guess that it may be mostly related to web based servicing …), then it is a bad thing that Google is doing, and it can in no way be justified as ok just because they hire a few open source developers or support a few open source projects monetarily.

If someone says that they don't think it's worth opensourcing their code, because people may not understand the code or may not have any use for it, then they are talking garbage. If all the initial developers of opensource code had worked with the same mentality as above, then the open source movement wouldn't have been the great movement to be reckoned with that it is today.

Yes, there might be people who are mentored slowly into open source projects, but at the same time you will find a lot of people silently contributing to or using opensource code/projects without having any mentor to guide them, because they have some circumstances which they feel are best resolved using open source projects, and they then learn the abc's of the project on their own, based on the code available to them and by experimenting with that code.

NOTES: As I don't keep track of events on the opensource front actively, I don't know if Google is guilty or not. But if any company (Google or otherwise) has an attitude that whatever code they are working on, which in turn is directly or indirectly built on open source projects, is not worth opensourcing just because they feel others may not understand it or may not have use for it, then this is NOT a GOOD TREND NOR ATTITUDE, NOR IS IT ETHICAL. And no one should praise such a company and suggest that the better thing for such a company to do is (a) to hire a few open source developers to let them work on their open source projects, or (b) contribute monetarily to open source projects, or (c) mobilize people to work on opensource projects. What I mean is, even though (a), (b) and (c) above are in themselves good things, they can in NO way justify the stealing (if I may use such a harsh word) of the efforts of other opensource developers, however small it might be. Because it goes against the fundamentals of the open source movement, which are essential to keep the opensource movement alive.

