Search Results

Search found 11067 results on 443 pages for 'generic collection'.


  • Java. Best procedure to de-serialize a Java generic object?

    - by Jake
    What is the best procedure for storing and retrieving, using native Java serialization, generic objects like ArrayList<String>? Edit: To clarify. When I serialize an object of type ArrayList<String> I'd like to de-serialize to the same type of object. However, I know of no way to cast back to this generic object without causing warnings.
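    For what it's worth, here is a minimal Java sketch of the round trip being asked about (the class name and in-memory stream are illustrative only). The point is that readObject() returns Object, so the cast back to ArrayList<String> is inherently unchecked because the element type is erased at runtime; the usual approach is to isolate and suppress the warning:

        import java.io.*;
        import java.util.ArrayList;

        public class GenericSerialization {
            public static void main(String[] args) throws IOException, ClassNotFoundException {
                ArrayList<String> list = new ArrayList<String>();
                list.add("hello");

                // Serialize to an in-memory stream (a file stream works the same way)
                ByteArrayOutputStream bytes = new ByteArrayOutputStream();
                ObjectOutputStream out = new ObjectOutputStream(bytes);
                out.writeObject(list);
                out.close();

                // De-serialize: readObject() returns Object, so the cast is unchecked
                ObjectInputStream in = new ObjectInputStream(
                        new ByteArrayInputStream(bytes.toByteArray()));
                @SuppressWarnings("unchecked")
                ArrayList<String> restored = (ArrayList<String>) in.readObject();
                in.close();

                System.out.println(restored);
            }
        }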

    Read the article

  • How can I manipulate a VB6 Collection in .NET?

    - by jhominal
    Hello all, I am currently in the process of designing an interface for .NET software that would be consumed by COM objects - specifically, VB6. While I have found a number of pages by Microsoft detailing how to make a COM-interoperable interface, I am currently tripping over the use of Collections at design time: I would like to be able to use a standard VB6 "Collection object" in the .NET program - for example, specify an argument as being a VB6 collection - and thus minimize the time necessary for clients to consume the interface. Thank you in advance.

    Read the article

  • Is there a way to organize an icon collection to allow for easy searching?

    - by John M
    Is there any way of organizing an icon collection so that it is easier to find the icons you need? For example: the program needs a save icon; there are 5 icon collections on your HD that have a save icon, and 5 more collections that don't (but you don't know that). Do you browse through each icon collection? Run a search (assuming files are named consistently)? Would it be ideal to have some sort of organized directory (printable?)?

    Read the article

  • Can't make AWUS036H work in Ubuntu 12.10

    - by sfrj
    I am using 64-bit Ubuntu 12.10. This is my kernel version:

        Linux 3.5.0-19-generic #30-Ubuntu SMP Tue Nov 13 17:48:01 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

    My wireless card is an AWUS036H. The first thing I do to install the driver is copy the driver from the CD to the Downloads folder:

        /media/me/AWUS036H/Drivers/RTL8187L/Unix (Linux)/Linux driver for kernel 2.6.X$ cp rtl8187_linux_26.1025.0328.2007.tar.gz ~/Downloads/

    Then I extract the tar:

        tar xvfz rtl8187_linux_26.1025.0328.2007.tar.gz

    I navigate into the extracted folder, and I try to follow the instructions in the ReadMe.txt:

        cd rtl8187_linux_26.1025.0328.2007

    These are the contents of the folder:

        drv.tar.gz  makedrv  stack.tar.gz  wlan0rmv  ieee80211  ReadMe.txt  wlan0dhcp  wlan0up  ifcfg-wlan0  rtl8187  wlan0down  wpa_supplicant-0.4.9

    This is what the ReadMe.txt says:

        Release Date: 2006-02-09, ver 1.2
        RTL8187 Linux driver version 1.2

        --This driver supports RealTek RTL8187 Wireless LAN driver for
          Fedora Core 2/3/4/5, Debian 3.1, Mandrake 10.2/Mandriva 2006,
          SUSE 9.3/10.1/10.2, Gentoo 3.1, etc.
        - Support Client mode for either infrastructure or adhoc mode
        - Support WEP and WPAPSK connection

        < Component >
        The driver is composed of several parts:
        1. Module source code
           stack.tar.gz
           drv.tar.gz
        2. Script ot build the modules
           makedrv
        3. Script to load/unload modules
           wlan0up
           wlan0down
        4. Script and configuration for DHCP

        "ReadMe.txt" [readonly] 140 lines, 4590 characters

    So what I do now is extract both of the compressed files:

        sudo tar xvfz drv.tar.gz
        sudo tar xvfz stack.tar.gz

    These 2 commands will add some data to the folders ieee80211 and rtl8187. At this point I get lost, and I don't know what to do. If I go into each of these 2 folders and run the sudo make command, then I get errors like this one:

        sudo make
        make -C /lib/modules/3.5.0-19-generic/build M=/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187 modules
        make[1]: Entering directory `/usr/src/linux-headers-3.5.0-19-generic'
        CC [M] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187_core.o
        In file included from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187_core.c:64:0:
        /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187.h:29:26: fatal error: linux/config.h: No such file or directory
        compilation terminated.
make[2]: *** [/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187_core.o] Error 1 make[1]: *** [_module_/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187] Error 2 make[1]: Leaving directory `/usr/src/linux-headers-3.5.0-19-generic' make: *** [modules] Error 2 If I try to run any of the script ./makedrv that the instructions describe, then I also get an error: ~/Downloads/rtl8187_linux_26.1025.0328.2007$ sudo ./makedrv [sudo] password for me: ieee80211/ ieee80211/license ieee80211/ieee80211_crypt.c ieee80211/ieee80211_tx.c ieee80211/ieee80211_softmac.c ieee80211/ieee80211_softmac_wx.c ieee80211/ieee80211_module.c ieee80211/ieee80211_crypt_ccmp.c ieee80211/ieee80211_rx.c ieee80211/tags ieee80211/ieee80211_crypt_tkip.c ieee80211/Makefile ieee80211/readme ieee80211/.tmp_versions/ ieee80211/.tmp_versions/ieee80211-rtl.mod ieee80211/.tmp_versions/ieee80211_crypt_wep-rtl.mod ieee80211/.tmp_versions/ieee80211_crypt_tkip-rtl.mod ieee80211/.tmp_versions/ieee80211_crypt-rtl.mod ieee80211/.tmp_versions/ieee80211_crypt_ccmp-rtl.mod ieee80211/ieee80211_crypt_wep.c ieee80211/ieee80211.h ieee80211/ieee80211_wx.c ieee80211/ieee80211_crypt.h rtl8187/ rtl8187/license rtl8187/r8180_rtl8225z2.c rtl8187/r8180_rtl8225.h rtl8187/r8187_led.c rtl8187/r8180_93cx6.h rtl8187/r8180_wx.h rtl8187/r8180_hw.h rtl8187/copying rtl8187/r8187_led.h rtl8187/r8180_pm.h rtl8187/tags rtl8187/r8187.h rtl8187/Makefile rtl8187/r8180_rtl8225.c rtl8187/readme rtl8187/install rtl8187/.tmp_versions/ rtl8187/.tmp_versions/r8187.mod rtl8187/changes rtl8187/r8180_wx.c rtl8187/r8180_pm.c rtl8187/r8187_core.c rtl8187/r8180_93cx6.c rtl8187/authors rtl8187/ieee80211.h rtl8187/ieee80211_crypt.h rm -f *.mod.c *.mod *.o .*.cmd *.ko *~ rm -rf /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/tmp make -C /lib/modules/3.5.0-19-generic/build M=/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211 modules make[1]: Entering directory `/usr/src/linux-headers-3.5.0-19-generic' CC [M] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.o In file included from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17:0: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:1019:24: error: field ‘ps_task’ has incomplete type /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_softmac_scan_wq’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:421:2: warning: passing argument 2 of ‘queue_delayed_work’ from incompatible pointer type [enabled by default] In file included from include/linux/srcu.h:32:0, from include/linux/notifier.h:15, from /usr/src/linux-headers-3.5.0-19-generic/arch/x86/include/asm/uprobes.h:26, from include/linux/uprobes.h:35, from include/linux/mm_types.h:15, from include/linux/kmemcheck.h:4, from include/linux/skbuff.h:18, from include/linux/if_ether.h:134, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:26, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17: include/linux/workqueue.h:371:12: note: expected ‘struct delayed_work *’ but argument is of type ‘struct work_struct *’ /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_softmac_stop_scan’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:495:3: warning: passing argument 1 of ‘cancel_delayed_work’ from incompatible pointer type [enabled by 
default] In file included from include/linux/srcu.h:32:0, from include/linux/notifier.h:15, from /usr/src/linux-headers-3.5.0-19-generic/arch/x86/include/asm/uprobes.h:26, from include/linux/uprobes.h:35, from include/linux/mm_types.h:15, from include/linux/kmemcheck.h:4, from include/linux/skbuff.h:18, from include/linux/if_ether.h:134, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:26, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17: include/linux/workqueue.h:410:20: note: expected ‘struct delayed_work *’ but argument is of type ‘struct work_struct *’ /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_associate_abort’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:915:2: warning: passing argument 2 of ‘queue_delayed_work’ from incompatible pointer type [enabled by default] In file included from include/linux/srcu.h:32:0, from include/linux/notifier.h:15, from /usr/src/linux-headers-3.5.0-19-generic/arch/x86/include/asm/uprobes.h:26, from include/linux/uprobes.h:35, from include/linux/mm_types.h:15, from include/linux/kmemcheck.h:4, from include/linux/skbuff.h:18, from include/linux/if_ether.h:134, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:26, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17: include/linux/workqueue.h:371:12: note: expected ‘struct delayed_work *’ but argument is of type ‘struct work_struct *’ /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_rx_frame_softmac’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:1527:3: error: implicit declaration of function ‘tasklet_schedule’ [-Werror=implicit-function-declaration] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_stop_protocol_rtl’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2120:2: warning: passing argument 1 of ‘cancel_delayed_work’ from incompatible pointer type [enabled by default] In file included from include/linux/srcu.h:32:0, from include/linux/notifier.h:15, from /usr/src/linux-headers-3.5.0-19-generic/arch/x86/include/asm/uprobes.h:26, from include/linux/uprobes.h:35, from include/linux/mm_types.h:15, from include/linux/kmemcheck.h:4, from include/linux/skbuff.h:18, from include/linux/if_ether.h:134, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:26, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17: include/linux/workqueue.h:410:20: note: expected ‘struct delayed_work *’ but argument is of type ‘struct work_struct *’ /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_softmac_init’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2229:78: error: macro "INIT_WORK" passed 3 arguments, but takes just 2 /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2229:2: error: ‘INIT_WORK’ undeclared (first use in this function) /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2229:2: note: each undeclared identifier is reported only once for each function it appears in /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2230:88: error: macro "INIT_WORK" passed 3 arguments, 
but takes just 2 /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2231:94: error: macro "INIT_WORK" passed 3 arguments, but takes just 2 /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2232:96: error: macro "INIT_WORK" passed 3 arguments, but takes just 2 /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2233:82: error: macro "INIT_WORK" passed 3 arguments, but takes just 2 /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2234:82: error: macro "INIT_WORK" passed 3 arguments, but takes just 2 /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2244:2: error: implicit declaration of function ‘tasklet_init’ [-Werror=implicit-function-declaration] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_softmac_free’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2255:2: warning: passing argument 1 of ‘cancel_delayed_work’ from incompatible pointer type [enabled by default] In file included from include/linux/srcu.h:32:0, from include/linux/notifier.h:15, from /usr/src/linux-headers-3.5.0-19-generic/arch/x86/include/asm/uprobes.h:26, from include/linux/uprobes.h:35, from include/linux/mm_types.h:15, from include/linux/kmemcheck.h:4, from include/linux/skbuff.h:18, from include/linux/if_ether.h:134, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:26, from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17: include/linux/workqueue.h:410:20: note: expected ‘struct delayed_work *’ but argument is of type ‘struct work_struct *’ /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: In function ‘ieee80211_wpa_set_encryption’: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2489:3: error: implicit declaration of function ‘request_module’ [-Werror=implicit-function-declaration] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2518:3: error: implicit declaration of function ‘try_module_get’ [-Werror=implicit-function-declaration] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c: At top level: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2663:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2663:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2663:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2664:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2664:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2664:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2665:1: warning: data definition has no type or storage class [enabled by default] 
/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2665:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2665:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2666:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2666:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2666:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2667:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2667:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2667:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2668:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2668:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2668:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2669:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2669:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2669:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2670:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2670:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2670:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2671:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2671:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2671:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2672:1: warning: data definition has no type or storage 
class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2672:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2672:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2673:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2673:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2673:1: warning: parameter names (without types) in function declaration [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2674:1: warning: data definition has no type or storage class [enabled by default] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2674:1: warning: type defaults to ‘int’ in declaration of ‘EXPORT_SYMBOL’ [-Wimplicit-int] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:2674:1: warning: parameter names (without types) in function declaration [enabled by default] In file included from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.c:17:0: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211.h:1212:37: warning: ‘netdev_priv’ is static but used in inline function ‘ieee80211_priv’ which is not static [enabled by default] cc1: some warnings being treated as errors make[2]: *** [/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211/ieee80211_softmac.o] Error 1 make[1]: *** [_module_/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/ieee80211] Error 2 make[1]: Leaving directory `/usr/src/linux-headers-3.5.0-19-generic' make: *** [modules] Error 2 rm -f *.mod.c *.mod *.o .*.cmd *.ko *~ rm -rf /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/tmp make -C /lib/modules/3.5.0-19-generic/build M=/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187 modules make[1]: Entering directory `/usr/src/linux-headers-3.5.0-19-generic' CC [M] /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187_core.o In file included from /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187_core.c:64:0: /home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187.h:29:26: fatal error: linux/config.h: No such file or directory compilation terminated. make[2]: *** [/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187/r8187_core.o] Error 1 make[1]: *** [_module_/home/me/Downloads/rtl8187_linux_26.1025.0328.2007/rtl8187] Error 2 make[1]: Leaving directory `/usr/src/linux-headers-3.5.0-19-generic' make: *** [modules] Error 2 Can somebody give me a hand finding out what I need to do to make my wifi card work? Update This is the output of the lsusb command lsusb Bus 003 Device 002: ID 147e:1000 Upek Biometric Touchchip/Touchstrip Fingerprint Sensor Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub

    Read the article

  • Trojan Horse Generic.15.apnz Virus Impossible to get rid of?!

    - by DaveDev
    Hi guys, I'm new to this forum so I'm unsure whether this is the right place for a question like this. I have a Trojan Horse infection that lives in memory and seems to be impossible to get rid of. I've tried a few antivirus products (Norton, Windows Essentials & AVG Free), all to no avail, and I've recently tried a few bootable antivirus solutions I found here: http://www.techmixer.com/free-bootable-antivirus-rescue-cds-download-list/

    - Kaspersky Rescue Disk 2008 (failed - it wouldn't even load the UI; is there a newer one out there?)
    - F-Secure Rescue Disk (updates & scanner ran, found 10 infections, said it was going to delete or repair them, but didn't get rid of them)
    - Avira found a load of infections but froze when I tried to interact with the UI after the scan.

    Every time I run these I'll load Windows afterwards & run AVG, and it still finds Trojan Horse Generic.15.apnz (in Services.exe) & Trojan Horse Generic.16.ARSU (in svchost.exe). Is anyone familiar with a virus like this? Can anyone suggest how to approach a solution? Thanks, Dave

    Read the article

  • How do you monitor SSD wear in Windows when the drives are presented as 'generic' devices?

    - by MikeyB
    Under Linux, we can monitor SSD wear fairly easily with smartmontools whether the drive is presented as a normal block device or a generic device (which happens when the drive has been hardware RAIDed by certain controllers such as the one on the IBM HS22). How can we do the equivalent under Windows? Does anyone actually use smartmontools? Or are there other packages out there? The problem is that SCSI Generic devices just don't show up in Windows. If the drives aren't RAIDed we can see them fine. How I'd do it in Linux:

        sles11-live:~ # lsscsi -g
        [1:0:0:0]   disk  SMART     USB-IBM           8989  /dev/sda  /dev/sg0
        [2:0:0:0]   disk  ATA       MTFDDAK256MAR-1K  MA44  -         /dev/sg1
        [2:0:1:0]   disk  ATA       MTFDDAK256MAR-1K  MA44  -         /dev/sg2
        [2:1:8:0]   disk  LSILOGIC  Logical Volume    3000  /dev/sdb  /dev/sg3

        sles11-live:~ # smartctl -l ssd /dev/sg1
        smartctl 5.42 2011-10-20 r3458 [x86_64-linux-2.6.32.49-0.3-default] (local build)
        Copyright (C) 2002-11 by Bruce Allen, http://smartmontools.sourceforge.net

        Device Statistics (GP Log 0x04)
        Page Offset Size  Value  Description
          7  =====  =     =      == Solid State Device Statistics (rev 1) ==
          7  0x008  1       26~  Percentage Used Endurance Indicator
                            |_ ~ normalized value

        sles11-live:~ # smartctl -l ssd /dev/sg2
        smartctl 5.42 2011-10-20 r3458 [x86_64-linux-2.6.32.49-0.3-default] (local build)
        Copyright (C) 2002-11 by Bruce Allen, http://smartmontools.sourceforge.net

        Device Statistics (GP Log 0x04)
        Page Offset Size  Value  Description
          7  =====  =     =      == Solid State Device Statistics (rev 1) ==
          7  0x008  1        3~  Percentage Used Endurance Indicator
                            |_ ~ normalized value
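    smartmontools does ship Windows builds, so one avenue worth trying is the same smartctl commands from a Windows console. This is a sketch, not verified on the HS22: the /dev/csmi0,0 form is an assumption that the RAID controller exposes the CSMI interface smartmontools knows how to talk to; if it doesn't, the drive will stay invisible here too.

        rem See which devices smartctl can reach on this machine
        smartctl --scan

        rem A drive that shows up normally
        smartctl -l ssd /dev/sda

        rem Drive 0 behind a controller exposing the CSMI interface (assumption)
        smartctl -l ssd /dev/csmi0,0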

    Read the article

  • How to use Zune software to listen to podcasts with generic MP3 player?

    - by Bevan
    I listen to a bunch of podcasts - a great way to fill the otherwise mindless space of a daily commute. My MP3 player is a Transonic brand that appears on my computer as a generic storage device. I've been using iTunes to download the podcasts and manually moving the files out of the disk folder onto my player, but this is pretty tedious. iTunes also fails to recognise that the files are gone and leaves them in the list. (Actually, iTunes for Windows is pretty much a dog, but that's a different rant.) The Zune software is 99% of what I want in a podcast downloader - it performs well, looks nice, downloads reliably, and so on. Some features - like only downloading the next five unheard episodes of a podcast - are superb. However, if I manually move the files across to the MP3 player, the Zune software concludes that the file has never been downloaded, and downloads it again. This leads me to my question: What is a good way to use the Zune software to download podcasts for listening on a generic MP3 player? Are there any addons for the Zune software to make this easier? Registry hacks? Can I configure the Zune software to not download the same episode multiple times? Is there a way for the Zune software to populate my MP3 player directly, instead of having to copy files?

    Read the article

  • Integrate SharePoint 2010 with Team Foundation Server 2010

    - by Martin Hinshelwood
    Our client is using a brand new shiny installation of SharePoint 2010, so we need to integrate our upgraded Team Foundation Server 2010 instance into it. In order to do that you need to run the Team Foundation Server 2010 install on the SharePoint 2010 server and choose to install only the “Extensions for SharePoint Products and Technologies”. We want our upgraded Team Project Collection to create any new portal in this SharePoint 2010 server farm. There are a number of goodies above and beyond a solution file that require the install, with the main one being the TFS 2010 client API. These goodies allow proper integration with the creation and viewing of Work Items from SharePoint, a new feature with TFS 2010. This works in both SharePoint 2007 and SharePoint 2010, with the level of integration dependent on the version of SharePoint that you are running. There are three levels of integration, with “SharePoint Services 3.0” or “SharePoint Foundation 2010” being the lowest. This level only offers reporting-services framed integration for reporting along with Work Item integration and document management. The highest is Microsoft Office SharePoint Services (MOSS) Enterprise, with Excel Services integration providing some lovely dashboards.

    Figure: Dashboards take the guessing out of Project Planning and estimation. Plus writing these reports would be boring!

    The Extensions that you need are on the same installation media as the main TFS install and the only difference is the options you pick during the install.

    Figure: Installing the TFS 2010 Extensions for SharePoint Products and Technologies onto SharePoint 2010

    Annoyingly you may need to reboot a couple of times, but on this server the process was MUCH smoother than on our internal server. I think this was mostly to do with this being a clean install. Once it is installed you need to run the configuration. This will add all of the Solutions and Templates that are needed for SharePoint to work properly with TFS.

    Figure: This is where all the TFS 2010 goodies are added to your SharePoint 2010 server and the TFS 2010 object model is installed.

    Figure: All done, you have everything installed, but you still need to configure it.

    Now that we have the TFS 2010 SharePoint Extensions installed on our SharePoint 2010 server we need to configure them both so that they will talk happily to each other.

    Configuring the SharePoint 2010 Managed path for Team Foundation Server 2010

    In order for TFS to automatically create your project portals you need a wildcard managed path setup. This is where TFS will create the portal during the creation of a new Team Project. To find the managed paths page for any application you need to first select the “Manage web applications” link from the SharePoint 2010 Central Administration screen.

    Figure: Find the “Manage web applications” link under the “Application Management” section.

    Once you are there you will see that the “Managed Paths” are there, they are just greyed out, and selecting one of the applications will enable them to be clicked.

    Figure: You need to select an application for the SharePoint 2010 ribbon to activate.

    Figure: You need to select an application before you can get to the Managed Paths for that application.

    Now we need to add a managed path for TFS 2010 to create its portals under. I have gone for the obvious option of just calling the managed path “TFS02” as the TFS 2010 server is the second TFS server that the client has installed, TFS 2008 being the first.
    This links the location to the server name, and as you can’t have two projects of the same name in two separate project collections there are unlikely to be any conflicts.

    Figure: Add a “tfs02” wildcard inclusion path to your SharePoint site.

    Configure the Team Foundation Server 2010 connection to SharePoint 2010

    In order to have your new TFS 2010 Server talk to and create sites in SharePoint 2010 you need to tell the TFS server where to put them. As this TFS 2010 server was installed in out-of-the-box mode it has a SharePoint Services 3.0 (the free one) server running on the same box. But we want to change that so we can use the external SharePoint 2010 instance. Just open the “Team Foundation Server Administration Console” and navigate to the “SharePoint Web Applications” section. Here you click “Add” and enter the details for the Managed path we just created.

    Figure: If you have special permissions on your SharePoint you may need to add accounts to the “Service Accounts” section.

    Before we can set this new SharePoint 2010 instance to be the default for our upgraded Team Project Collection we need to configure SharePoint to take instructions from our TFS server.

    Configure SharePoint 2010 to connect to Team Foundation Server 2010

    On your SharePoint 2010 server open the Team Foundation Server Administration Console and select the “Extensions for SharePoint Products and Technologies” node. Here we need to “grant access” for our TFS 2010 server to create sites. Click the “Grant access” link and fill out the full URL to the TFS server, for example http://servername.domain.com:8080/tfs, and if need be restrict the path that TFS sites can be created on. Remember that when the users create a new team project they can change the default and point it anywhere they like as long as it is an authorised SharePoint location.

    Figure: Grant access for your TFS 2010 server to create sites in SharePoint 2010.

    Now that we have an authorised location for our team project portals to be created we need to tell our Team Project Collection that this is where it should stick sites by default for any new Team Projects created.

    Configure the Team Foundation Server 2010 Team Project Collection to create new sites in SharePoint 2010

    Back on our TFS 2010 server we need to set up the defaults for our upgraded Team Project Collection to the new SharePoint 2010 integration we have just set up. On the TFS 2010 server open up the “Team Foundation Server Administration Console” again and navigate to the “Team Project Collections” node. Once you are there you will see a list of all of your TPCs, and in our case we have a DefaultCollection as well as our named and upgraded collection for TFS 2008. If you select the “SharePoint Site” tab we can see that it is not currently configured.

    Figure: Our new upgraded TFS 2008 Team Project Collection does not have SharePoint configured.

    Select “Edit Default Site Location” and select the new integration point that we just set up for SharePoint 2010. Once you have selected the “SharePoint Web Application” (the thing we just configured) it will give you an example based on that configuration point and the name of the Team Project Collection that we are configuring.

    Figure: Set the default location for new Team Project Portals to be created for this Team Project Collection.

    This is where the reason for configuring the Extensions on the SharePoint 2010 server before doing this last bit becomes apparent.
    TFS 2010 is going to create a site at our http://sharepointserver/tfs02/ location called http://sharepointserver/tfs02/[TeamProjectCollection], or whatever we had specified, and it would have had difficulty doing this if we had not given it permission first.

    Figure: If there is no Team Project Collection site at this location the TFS 2010 server is going to create one.

    This will create a nice Team Project Collection parent site to contain the portals for any new Team Projects that are created. It is worth noting that it will not create portals for existing Team Projects, as this process is run during the Team Project Creation wizard.

    Figure: Just a basic parent site to host all of your new Team Project Portals as sub sites.

    You will need to add all of the users that will be creating Team Projects as Administrators of this site so that they will not get an error during the Project Creation Wizard. You may also want to customise this as a proper portal to your projects if you are going to be having lots of them, but it is really just a default placeholder so you have a top level site that you can back up and point at.

    You have now integrated SharePoint 2010 and Team Foundation Server 2010! You can now go forth and multiply your Team Projects for this Team Project Collection, or you can continue to add portals to your other Collections.

    Technorati Tags: TFS 2010, Sharepoint 2010, VS ALM

    Read the article

  • AutoMapper MappingFunction from Source Type of NameValueCollection

    - by REA_ANDREW
    I have had a situation arise today where I need to construct a complex type from a source of a NameValueCollection. A little while back I submitted a patch for the Agatha Project to include REST (JSON and XML) support for the service contract. I realized today that, as useful as it is, it did not actually support true REST conformance, as REST should support GET so that you can use JSONP from JavaScript directly, meaning you can query cross-domain services. My original implementation for POX and JSON used the POST method, and this immediately rules out JSONP as, from reading, JSONP only works with GET requests. This then raised another issue. The current operation contract of Agatha, and one of its main benefits, is that you can supply an array of Request objects in a single request, limiting the amount of server requests you need to make. Now, at the present time I am thinking that this will not be the case for the REST implementation, but it will yield the benefits of the fact that:

    - The same Request objects can be used for SOAP and REST (POX, JSON)
    - The construct of the JavaScript functions will be simpler and more readable
    - It will enable the use of JSONP for cross domain REST services

    The current contract for the Agatha WcfRequestProcessor is, at time of writing, the following:

        [ServiceContract]
        public interface IWcfRequestProcessor
        {
            [OperationContract(Name = "ProcessRequests")]
            [ServiceKnownType("GetKnownTypes", typeof(KnownTypeProvider))]
            [TransactionFlow(TransactionFlowOption.Allowed)]
            Response[] Process(params Request[] requests);

            [OperationContract(Name = "ProcessOneWayRequests", IsOneWay = true)]
            [ServiceKnownType("GetKnownTypes", typeof(KnownTypeProvider))]
            void ProcessOneWayRequests(params OneWayRequest[] requests);
        }

    My current proposed solution, at the very early stages of my concept, is as follows:

        [ServiceContract]
        public interface IWcfRestJsonRequestProcessor
        {
            [OperationContract(Name = "process")]
            [ServiceKnownType("GetKnownTypes", typeof(KnownTypeProvider))]
            [TransactionFlow(TransactionFlowOption.Allowed)]
            [WebGet(UriTemplate = "process/{name}/{*parameters}", BodyStyle = WebMessageBodyStyle.WrappedResponse, ResponseFormat = WebMessageFormat.Json)]
            Response[] Process(string name, NameValueCollection parameters);

            [OperationContract(Name = "processoneway", IsOneWay = true)]
            [ServiceKnownType("GetKnownTypes", typeof(KnownTypeProvider))]
            [WebGet(UriTemplate = "process-one-way/{name}/{*parameters}", BodyStyle = WebMessageBodyStyle.WrappedResponse, ResponseFormat = WebMessageFormat.Json)]
            void ProcessOneWayRequests(string name, NameValueCollection parameters);
        }

    Now, this part I have not yet implemented; it is the preliminary step which I have developed, which will allow me to take the name of the Request type and the NameValueCollection and construct the complex type which is that of the Request, which I can then supply to a nested instance of the original IWcfRequestProcessor to work as it should normally. To give an example, some of the URLs which I envisage with this method are:

        http://www.url.com/service.svc/json/process/getweather/?location=london
        http://www.url.com/service.svc/json/process/getproductsbycategory/?categoryid=1
        http://www.url.com/service.svc/json/process/sayhello/?name=andy

    Another reason why my direction has gone to a single request for the REST implementation is because of restrictions which are imposed by browsers on the length of the URL. From what I have read this is on average 2000 characters.
    I think that this is a very acceptable usage limit in the context of using 1 request, but I do not think this is acceptable for accommodating multiple requests chained together. I would love to be corrected on that one, I really would, but unfortunately from what I have read I have come to the conclusion that this is not the case.

    The mapping function

    So, as I say, this is just the first pass I have made at this, and I am not overly happy with the try/catch for detecting types without default constructors. I know there is a better way but for the minute it escapes me. I would also like to know the correct way for adding mapping functions and not using the anonymous way that I have used. To achieve this I have used recursion, which I am sure is what other mapping functions use, as you do have to go as deep as the complex type is.

        public static object RecurseType(NameValueCollection collection, Type type, string prefix)
        {
            try
            {
                var returnObject = Activator.CreateInstance(type);
                foreach (var property in type.GetProperties())
                {
                    foreach (var key in collection.AllKeys)
                    {
                        if (String.IsNullOrEmpty(prefix) || key.Length > prefix.Length)
                        {
                            var propertyNameToMatch = String.IsNullOrEmpty(prefix) ? key : key.Substring(property.Name.IndexOf(prefix) + prefix.Length + 1);
                            if (property.Name == propertyNameToMatch)
                            {
                                property.SetValue(returnObject, Convert.ChangeType(collection.Get(key), property.PropertyType), null);
                            }
                            else if (property.GetValue(returnObject, null) == null)
                            {
                                property.SetValue(returnObject, RecurseType(collection, property.PropertyType, String.Concat(prefix, property.PropertyType.Name)), null);
                            }
                        }
                    }
                }
                return returnObject;
            }
            catch (MissingMethodException)
            {
                // Quite a blunt way of dealing with types without a default constructor
                return null;
            }
        }

    Another thing is performance; I have not measured this in any way. It is, as I say, the first pass, so I hope this can be the start of a more perfected implementation. I tested this out with a complex type of three levels; there is no intended logical meaning to the properties, they are simply for the purposes of example. You could call this a spiking session, as from here on in, now I know what I am building, I would take a more TDD approach. OK, purists, why did I not do this from the start? Well, I didn't; this was a brain dump, and now I know what I am building I can. The console test and how I used it with AutoMapper is as follows:

        static void Main(string[] args)
        {
            var collection = new NameValueCollection();
            collection.Add("Name", "Andrew Rea");
            collection.Add("Number", "1");
            collection.Add("AddressLine1", "123 Street");
            collection.Add("AddressNumber", "2");
            collection.Add("AddressPostCodeCountry", "United Kingdom");
            collection.Add("AddressPostCodeNumber", "3");

            AutoMapper.Mapper.CreateMap<NameValueCollection, Person>()
                .ConvertUsing(x =>
                {
                    return (Person)RecurseType(x, typeof(Person), null);
                });

            var person = AutoMapper.Mapper.Map<NameValueCollection, Person>(collection);

            Console.WriteLine(person.Name);
            Console.WriteLine(person.Number);
            Console.WriteLine(person.Address.Line1);
            Console.WriteLine(person.Address.Number);
            Console.WriteLine(person.Address.PostCode.Country);
            Console.WriteLine(person.Address.PostCode.Number);
            Console.ReadLine();
        }

    Notice the convention that I am using, and that this method requires you to use: each property is prefixed with the constructed name of its parents combined. This is the convention used by AutoMapper and it makes sense.
    I can also think of other uses for this, including using it with ASP.NET MVC ModelBinders for creating a complex type from the QueryString, which is itself a NameValueCollection. Hope this is of some help to people, and I would welcome any code reviews you could give me.

    References:
    Agatha: http://code.google.com/p/agatha-rrsl/
    AutoMapper: http://automapper.codeplex.com/

    Cheers for now,
    Andrew

    P.S. I will have the proposed solution for a more complete REST implementation for AGATHA very soon.

    Read the article

  • Imperative Programming v/s Declarative Programming v/s Functional Programming

    - by kaleidoscope
    Imperative Programming :: Imperative programming is a programming paradigm that describes computation in terms of statements that change a program state. In much the same way as the imperative mood in natural languages expresses commands to take action, imperative programs define sequences of commands for the computer to perform. The focus is on what steps the computer should take rather than what the computer will do (ex. C, C++, Java).

    Declarative Programming :: Declarative programming is a programming paradigm that expresses the logic of a computation without describing its control flow. It attempts to minimize or eliminate side effects by describing what the program should accomplish, rather than describing how to go about accomplishing it. The focus is on what the computer should do rather than how it should do it (ex. SQL).

    A C# example of declarative v/s imperative programming is LINQ. With imperative programming, you tell the compiler what you want to happen, step by step. For example, let's start with this collection, and choose the odd numbers:

        List<int> collection = new List<int> { 1, 2, 3, 4, 5 };

    With imperative programming, we'd step through this, and decide what we want:

        List<int> results = new List<int>();
        foreach (var num in collection)
        {
            if (num % 2 != 0)
                results.Add(num);
        }

    Here’s what we are doing:

    - Create a result collection
    - Step through each number in the collection
    - Check the number; if it's odd, add it to the results

    With declarative programming, on the other hand, we write code that describes what we want, but not necessarily how to get it:

        var results = collection.Where(num => num % 2 != 0);

    Here, we're saying "Give us everything where it's odd", not "Step through the collection. Check this item, if it's odd, add it to a result collection."

    Functional Programming :: Functional programming is a programming paradigm that treats computation as the evaluation of mathematical functions and avoids state and mutable data. It emphasizes the application of functions. Functional programming has its roots in the lambda calculus. It is a subset of declarative languages that has a heavy focus on recursion. Functional programming can be a mind-bender, which is one reason why Lisp, Scheme, and Haskell have never really surpassed C, C++, Java and COBOL in commercial popularity. But there are benefits to the functional way. For one, if you can get the logic correct, functional programming requires orders of magnitude less code than imperative programming. That means fewer points of failure, less code to test, and a more productive (and, many would say, happier) programming life. As systems get bigger, this has become more and more important.

    To know more:
    http://stackoverflow.com/questions/602444/what-is-functional-declarative-and-imperative-programming
    http://msdn.microsoft.com/en-us/library/bb669144.aspx
    http://en.wikipedia.org/wiki/Imperative_programming

    Technorati Tags: Ranjit, Imperative Programming, Declarative programming, Functional Programming
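    The functional section above has no code sample to set against the imperative and declarative ones, so here is a minimal sketch of the same odd-number selection written in a functional style. It uses Java 8 streams rather than the C# of the other examples, but the idea is identical: pass a pure function to filter instead of mutating a result list.

        import java.util.Arrays;
        import java.util.List;
        import java.util.stream.Collectors;

        public class OddNumbers {
            public static void main(String[] args) {
                List<Integer> collection = Arrays.asList(1, 2, 3, 4, 5);

                // Declare what we want (the odd numbers) as a predicate passed to
                // filter; there is no mutable result list and no explicit loop state.
                List<Integer> results = collection.stream()
                                                  .filter(num -> num % 2 != 0)
                                                  .collect(Collectors.toList());

                System.out.println(results); // prints [1, 3, 5]
            }
        }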

    Read the article

  • Understanding G1 GC Logs

    - by poonam
    The purpose of this post is to explain the meaning of GC logs generated with some tracing and diagnostic options for G1 GC. We will take a look at the output generated with PrintGCDetails, which is a product flag and provides the most detailed level of information. Along with that, we will also look at the output of two diagnostic flags that get enabled with the -XX:+UnlockDiagnosticVMOptions option - G1PrintRegionLivenessInfo, which prints the occupancy and the amount of space used by live objects in each region at the end of the marking cycle, and G1PrintHeapRegions, which provides detailed information on the heap regions being allocated and reclaimed. We will be looking at the logs generated with JDK 1.7.0_04 using these options.

    Option -XX:+PrintGCDetails

    Here's a sample log of a G1 collection generated with PrintGCDetails.

        0.522: [GC pause (young), 0.15877971 secs]
           [Parallel Time: 157.1 ms]
              [GC Worker Start (ms): 522.1 522.2 522.2 522.2
               Avg: 522.2, Min: 522.1, Max: 522.2, Diff: 0.1]
              [Ext Root Scanning (ms): 1.6 1.5 1.6 1.9
               Avg: 1.7, Min: 1.5, Max: 1.9, Diff: 0.4]
              [Update RS (ms): 38.7 38.8 50.6 37.3
               Avg: 41.3, Min: 37.3, Max: 50.6, Diff: 13.3]
                 [Processed Buffers : 2 2 3 2
                  Sum: 9, Avg: 2, Min: 2, Max: 3, Diff: 1]
              [Scan RS (ms): 9.9 9.7 0.0 9.7
               Avg: 7.3, Min: 0.0, Max: 9.9, Diff: 9.9]
              [Object Copy (ms): 106.7 106.8 104.6 107.9
               Avg: 106.5, Min: 104.6, Max: 107.9, Diff: 3.3]
              [Termination (ms): 0.0 0.0 0.0 0.0
               Avg: 0.0, Min: 0.0, Max: 0.0, Diff: 0.0]
                 [Termination Attempts : 1 4 4 6
                  Sum: 15, Avg: 3, Min: 1, Max: 6, Diff: 5]
              [GC Worker End (ms): 679.1 679.1 679.1 679.1
               Avg: 679.1, Min: 679.1, Max: 679.1, Diff: 0.1]
              [GC Worker (ms): 156.9 157.0 156.9 156.9
               Avg: 156.9, Min: 156.9, Max: 157.0, Diff: 0.1]
              [GC Worker Other (ms): 0.3 0.3 0.3 0.3
               Avg: 0.3, Min: 0.3, Max: 0.3, Diff: 0.0]
           [Clear CT: 0.1 ms]
           [Other: 1.5 ms]
              [Choose CSet: 0.0 ms]
              [Ref Proc: 0.3 ms]
              [Ref Enq: 0.0 ms]
              [Free CSet: 0.3 ms]
           [Eden: 12M(12M)->0B(13M) Survivors: 0B->2048K Heap: 14M(64M)->9739K(64M)]
           [Times: user=0.59 sys=0.02, real=0.16 secs]

    This is the typical log of an Evacuation Pause (G1 collection) in which live objects are copied from one set of regions (young OR young+old) to another set. It is a stop-the-world activity and all the application threads are stopped at a safepoint during this time. This pause is made up of several sub-tasks indicated by the indentation in the log entries. Here is the topmost line that gets printed for the Evacuation Pause.

        0.522: [GC pause (young), 0.15877971 secs]

    This is the highest level information telling us that it is an Evacuation Pause that started at 0.522 secs from the start of the process, in which all the regions being evacuated are Young, i.e. Eden and Survivor regions. This collection took 0.15877971 secs to finish. Evacuation Pauses can be mixed as well, in which case the set of regions selected includes all of the young regions as well as some old regions.

        1.730: [GC pause (mixed), 0.32714353 secs]

    Let's take a look at all the sub-tasks performed in this Evacuation Pause.

        [Parallel Time: 157.1 ms]

    Parallel Time is the total elapsed time spent by all the parallel GC worker threads. The following lines correspond to the parallel tasks performed by these worker threads in this total parallel time, which in this case is 157.1 ms.

        [GC Worker Start (ms): 522.1 522.2 522.2 522.2
         Avg: 522.2, Min: 522.1, Max: 522.2, Diff: 0.1]

    The first line tells us the start time of each of the worker threads in milliseconds.
    The start times are ordered with respect to the worker thread ids – thread 0 started at 522.1ms and thread 1 started at 522.2ms from the start of the process. The second line tells the Avg, Min, Max and Diff of the start times of all of the worker threads.

        [Ext Root Scanning (ms): 1.6 1.5 1.6 1.9
         Avg: 1.7, Min: 1.5, Max: 1.9, Diff: 0.4]

    This gives us the time spent by each worker thread scanning the roots (globals, registers, thread stacks and VM data structures). Here, thread 0 took 1.6ms to perform the root scanning task and thread 1 took 1.5 ms. The second line clearly shows the Avg, Min, Max and Diff of the times spent by all the worker threads.

        [Update RS (ms): 38.7 38.8 50.6 37.3
         Avg: 41.3, Min: 37.3, Max: 50.6, Diff: 13.3]

    Update RS gives us the time each thread spent in updating the Remembered Sets. Remembered Sets are the data structures that keep track of the references that point into a heap region. Mutator threads keep changing the object graph and thus the references that point into a particular region. We keep track of these changes in buffers called Update Buffers. The Update RS sub-task processes the update buffers that were not able to be processed concurrently, and updates the corresponding remembered sets of all regions.

        [Processed Buffers : 2 2 3 2
         Sum: 9, Avg: 2, Min: 2, Max: 3, Diff: 1]

    This tells us the number of Update Buffers (mentioned above) processed by each worker thread.

        [Scan RS (ms): 9.9 9.7 0.0 9.7
         Avg: 7.3, Min: 0.0, Max: 9.9, Diff: 9.9]

    These are the times each worker thread had spent in scanning the Remembered Sets. The Remembered Set of a region contains cards that correspond to the references pointing into that region. This phase scans those cards looking for the references pointing into all the regions of the collection set.

        [Object Copy (ms): 106.7 106.8 104.6 107.9
         Avg: 106.5, Min: 104.6, Max: 107.9, Diff: 3.3]

    These are the times spent by each worker thread copying live objects from the regions in the Collection Set to the other regions.

        [Termination (ms): 0.0 0.0 0.0 0.0
         Avg: 0.0, Min: 0.0, Max: 0.0, Diff: 0.0]

    Termination time is the time spent by the worker thread offering to terminate. But before terminating, it checks the work queues of other threads, and if there are still object references in other work queues, it tries to steal object references, and if it succeeds in stealing a reference, it processes that and offers to terminate again.

        [Termination Attempts : 1 4 4 6
         Sum: 15, Avg: 3, Min: 1, Max: 6, Diff: 5]

    This gives the number of times each thread has offered to terminate.

        [GC Worker End (ms): 679.1 679.1 679.1 679.1
         Avg: 679.1, Min: 679.1, Max: 679.1, Diff: 0.1]

    These are the times in milliseconds at which each worker thread stopped.

        [GC Worker (ms): 156.9 157.0 156.9 156.9
         Avg: 156.9, Min: 156.9, Max: 157.0, Diff: 0.1]

    These are the total lifetimes of each worker thread.

        [GC Worker Other (ms): 0.3 0.3 0.3 0.3
         Avg: 0.3, Min: 0.3, Max: 0.3, Diff: 0.0]

    These are the times that each worker thread spent in performing some other tasks that we have not accounted above for the total Parallel Time.

        [Clear CT: 0.1 ms]

    This is the time spent in clearing the Card Table. This task is performed in serial mode.

        [Other: 1.5 ms]

    Time spent in some other tasks listed below. The following sub-tasks (which individually may be parallelized) are performed serially.

        [Choose CSet: 0.0 ms]

    Time spent in selecting the regions for the Collection Set.

        [Ref Proc: 0.3 ms]

    Total time spent in processing Reference objects.
        [Ref Enq: 0.0 ms]

    Time spent in enqueuing references to the ReferenceQueues.

        [Free CSet: 0.3 ms]

    Time spent in freeing the collection set data structure.

        [Eden: 12M(12M)->0B(13M) Survivors: 0B->2048K Heap: 14M(64M)->9739K(64M)]

    This line gives the details on the heap size changes with the Evacuation Pause. This shows that Eden had the occupancy of 12M and its capacity was also 12M before the collection. After the collection, its occupancy got reduced to 0 since everything is evacuated/promoted from Eden during a collection, and its target size grew to 13M. The new Eden capacity of 13M is not reserved at this point. This value is the target size of the Eden. Regions are added to Eden as the demand is made, and when the added regions reach the target size, we start the next collection. Similarly, Survivors had the occupancy of 0 bytes and it grew to 2048K after the collection. The total heap occupancy and capacity was 14M and 64M respectively before the collection, and it became 9739K and 64M after the collection.

    Apart from the evacuation pauses, G1 also performs concurrent-marking to build the live data information of regions.

        1.416: [GC pause (young) (initial-mark), 0.62417980 secs]
        ….....
        2.042: [GC concurrent-root-region-scan-start]
        2.067: [GC concurrent-root-region-scan-end, 0.0251507]
        2.068: [GC concurrent-mark-start]
        3.198: [GC concurrent-mark-reset-for-overflow]
        4.053: [GC concurrent-mark-end, 1.9849672 sec]
        4.055: [GC remark 4.055: [GC ref-proc, 0.0000254 secs], 0.0030184 secs]
         [Times: user=0.00 sys=0.00, real=0.00 secs]
        4.088: [GC cleanup 117M->106M(138M), 0.0015198 secs]
         [Times: user=0.00 sys=0.00, real=0.00 secs]
        4.090: [GC concurrent-cleanup-start]
        4.091: [GC concurrent-cleanup-end, 0.0002721]

    The first phase of a marking cycle is Initial Marking, where all the objects directly reachable from the roots are marked, and this phase is piggy-backed on a fully young Evacuation Pause.

        2.042: [GC concurrent-root-region-scan-start]

    This marks the start of a concurrent phase that scans the set of root-regions which are directly reachable from the survivors of the initial marking phase.

        2.067: [GC concurrent-root-region-scan-end, 0.0251507]

    End of the concurrent root region scan phase, and it lasted for 0.0251507 seconds.

        2.068: [GC concurrent-mark-start]

    Start of the concurrent marking at 2.068 secs from the start of the process.

        3.198: [GC concurrent-mark-reset-for-overflow]

    This indicates that the global marking stack had become full and there was an overflow of the stack. Concurrent marking detected this overflow and had to reset the data structures to start the marking again.

        4.053: [GC concurrent-mark-end, 1.9849672 sec]

    End of the concurrent marking phase, and it lasted for 1.9849672 seconds.

        4.055: [GC remark 4.055: [GC ref-proc, 0.0000254 secs], 0.0030184 secs]

    This corresponds to the remark phase, which is a stop-the-world phase. It completes the left over marking work (SATB buffers processing) from the previous phase. In this case, this phase took 0.0030184 secs, out of which 0.0000254 secs were spent on Reference processing.

        4.088: [GC cleanup 117M->106M(138M), 0.0015198 secs]

    Cleanup phase, which is again a stop-the-world phase. It goes through the marking information of all the regions, computes the live data information of each region, resets the marking data structures and sorts the regions according to their gc-efficiency. In this example, the total heap size is 138M, and after the live data counting it was found that the total live data size dropped down from 117M to 106M.
        4.090: [GC concurrent-cleanup-start]

    This concurrent cleanup phase frees up the regions that were found to be empty (didn't contain any live data) during the previous stop-the-world phase.

        4.091: [GC concurrent-cleanup-end, 0.0002721]

    Concurrent cleanup phase took 0.0002721 secs to free up the empty regions.

    Option -XX:+G1PrintRegionLivenessInfo

    Now, let's look at the output generated with the flag G1PrintRegionLivenessInfo. This is a diagnostic option and gets enabled with -XX:+UnlockDiagnosticVMOptions. G1PrintRegionLivenessInfo prints the live data information of each region during the Cleanup phase of the concurrent-marking cycle.

        26.896: [GC cleanup
        ### PHASE Post-Marking @ 26.896
        ### HEAP committed: 0x02e00000-0x0fe00000 reserved: 0x02e00000-0x12e00000 region-size: 1048576

    Cleanup phase of the concurrent-marking cycle started at 26.896 secs from the start of the process, and this live data information is being printed after the marking phase. Committed G1 heap ranges from 0x02e00000 to 0x0fe00000 and the total G1 heap reserved by the JVM is from 0x02e00000 to 0x12e00000. Each region in the G1 heap is of size 1048576 bytes.

        ### type address-range used prev-live next-live gc-eff
        ###                    (bytes) (bytes) (bytes) (bytes/ms)

    This is the header of the output that tells us about the type of the region, address-range of the region, used space in the region, live bytes in the region with respect to the previous marking cycle, live bytes in the region with respect to the current marking cycle and the GC efficiency of that region.

        ### FREE 0x02e00000-0x02f00000 0 0 0 0.0

    This is a Free region.

        ### OLD 0x02f00000-0x03000000 1048576 1038592 1038592 0.0

    Old region with address-range from 0x02f00000 to 0x03000000. Total used space in the region is 1048576 bytes, live bytes as per the previous marking cycle are 1038592 and live bytes with respect to the current marking cycle are also 1038592. The GC efficiency has been computed as 0.

        ### EDEN 0x03400000-0x03500000 20992 20992 20992 0.0

    This is an Eden region.

        ### HUMS 0x0ae00000-0x0af00000 1048576 1048576 1048576 0.0
        ### HUMC 0x0af00000-0x0b000000 1048576 1048576 1048576 0.0
        ### HUMC 0x0b000000-0x0b100000 1048576 1048576 1048576 0.0
        ### HUMC 0x0b100000-0x0b200000 1048576 1048576 1048576 0.0
        ### HUMC 0x0b200000-0x0b300000 1048576 1048576 1048576 0.0
        ### HUMC 0x0b300000-0x0b400000 1048576 1048576 1048576 0.0
        ### HUMC 0x0b400000-0x0b500000 1001480 1001480 1001480 0.0

    These are a continuous set of regions called Humongous regions for storing a large object. HUMS (Humongous starts) marks the start of the set of humongous regions and HUMC (Humongous continues) tags the subsequent regions of the humongous regions set.

        ### SURV 0x09300000-0x09400000 16384 16384 16384 0.0

    This is a Survivor region.

        ### SUMMARY capacity: 208.00 MB used: 150.16 MB / 72.19 % prev-live: 149.78 MB / 72.01 % next-live: 142.82 MB / 68.66 %

    At the end, a summary is printed listing the capacity, the used space and the change in the liveness after the completion of concurrent marking. In this case, G1 heap capacity is 208MB, total used space is 150.16MB which is 72.19% of the total heap size, live data in the previous marking was 149.78MB which was 72.01% of the total heap size, and the live data as per the current marking is 142.82MB which is 68.66% of the total heap size.

    Option -XX:+G1PrintHeapRegions

    The G1PrintHeapRegions option logs the region-related events when regions are committed, allocated into or are reclaimed.
COMMIT/UNCOMMIT events

G1HR COMMIT [0x6e900000,0x6ea00000]
G1HR COMMIT [0x6ea00000,0x6eb00000]
Here, the heap is being initialized or expanded, and regions (for example, the one with bottom 0x6ea00000 and end 0x6eb00000) are being freshly committed. COMMIT events are always generated in order, i.e. the next COMMIT event will always be for the uncommitted region with the lowest address.

G1HR UNCOMMIT [0x72700000,0x72800000]
G1HR UNCOMMIT [0x72600000,0x72700000]
The opposite of COMMIT. The heap shrank at the end of a Full GC and these regions are being uncommitted. Like COMMIT, UNCOMMIT events are also generated in order, i.e. the next UNCOMMIT event will always be for the committed region with the highest address.

GC Cycle events

G1HR #StartGC 7
G1HR CSET 0x6e900000
G1HR REUSE 0x70500000
G1HR ALLOC(Old) 0x6f800000
G1HR RETIRE 0x6f800000 0x6f821b20
G1HR #EndGC 7
This shows the start and end of an Evacuation Pause. These events are followed by a GC counter that tracks both evacuation pauses and Full GCs; here, this is the 7th GC since the start of the process.

G1HR #StartFullGC 17
G1HR UNCOMMIT [0x6ed00000,0x6ee00000]
G1HR POST-COMPACTION(Old) 0x6e800000 0x6e854f58
G1HR #EndFullGC 17
Shows the start and end of a Full GC. These events are followed by the same GC counter as above; this is the 17th GC since the start of the process.

ALLOC events

G1HR ALLOC(Eden) 0x6e800000
The region with bottom 0x6e800000 has just started being used for allocation. In this case it is an Eden region being allocated into by a mutator thread.

G1HR ALLOC(StartsH) 0x6ec00000 0x6ed00000
G1HR ALLOC(ContinuesH) 0x6ed00000 0x6e000000
Regions being used for the allocation of a humongous object; the object spans two regions.

G1HR ALLOC(SingleH) 0x6f900000 0x6f9eb010
A single region being used for the allocation of a humongous object.

G1HR COMMIT [0x6ee00000,0x6ef00000]
G1HR COMMIT [0x6ef00000,0x6f000000]
G1HR COMMIT [0x6f000000,0x6f100000]
G1HR COMMIT [0x6f100000,0x6f200000]
G1HR ALLOC(StartsH) 0x6ee00000 0x6ef00000
G1HR ALLOC(ContinuesH) 0x6ef00000 0x6f000000
G1HR ALLOC(ContinuesH) 0x6f000000 0x6f100000
G1HR ALLOC(ContinuesH) 0x6f100000 0x6f102010
Here, a humongous allocation request could not be satisfied by the free committed regions that existed in the heap, so the heap had to be expanded: new regions are committed and then allocated into for the humongous object.

G1HR ALLOC(Old) 0x6f800000
An Old region started being used for allocation during a GC.

G1HR ALLOC(Survivor) 0x6fa00000
A region being used for copying surviving objects into during a GC.

Note that Eden and humongous ALLOC events are generated outside the GC boundaries, while Old and Survivor ALLOC events are generated inside the GC boundaries.

Other Events

G1HR RETIRE 0x6e800000 0x6e87bd98
Retire and stop using the region with bottom 0x6e800000 and top 0x6e87bd98 for allocation. Note that most regions are full when they are retired, and those events are omitted to reduce the output volume. A region is retired when another region of the same type is allocated, or when we reach the start or end of a GC (depending on the region type). For Eden regions, for example:

1. ALLOC(Eden) Foo
2. ALLOC(Eden) Bar
3. StartGC

At point 2, Foo has just been retired and it was full. At point 3, Bar was retired and it was full. If they were not full when they were retired, there would be a RETIRE event:

1. ALLOC(Eden) Foo
2. RETIRE Foo top
3. ALLOC(Eden) Bar
4. StartGC

G1HR CSET 0x6e900000
The region with bottom 0x6e900000 is selected for the Collection Set. The region might have been selected for the collection set earlier (i.e. when it was allocated).
However, CSET events are generated for all regions in the CSet at the start of a GC, to make sure there is no confusion about which regions are part of the CSet.

G1HR POST-COMPACTION(Old) 0x6e800000 0x6e839858
A POST-COMPACTION event is generated for each non-empty region in the heap after a full compaction. A full compaction moves objects around, so we don't know what the resulting shape of the heap is (which regions were written to, which were emptied, and so on). To deal with this, a POST-COMPACTION event is generated for each non-empty region with its type (Old/Humongous) and its boundaries. At this point only Old and Humongous regions should exist, since the young generation has been collapsed, so there should be no Eden or Survivor regions. POST-COMPACTION events are generated within the Full GC boundary.

G1HR CLEANUP 0x6f400000
G1HR CLEANUP 0x6f300000
G1HR CLEANUP 0x6f200000
These regions were found to be empty after the remark phase of concurrent marking and are reclaimed shortly afterwards.

G1HR #StartGC 5
G1HR CSET 0x6f400000
G1HR CSET 0x6e900000
G1HR REUSE 0x6f800000
At the end of a GC, the Old region currently being allocated into is retired. If it is not full, allocation carries on into it during the next GC; this is what REUSE means. In the above case, 0x6f800000 should have been the last region with an ALLOC(Old) event during the previous GC, and should have been retired before the end of that GC.

G1HR ALLOC-FORCE(Eden) 0x6f800000
A specialization of ALLOC which indicates that the maximum desired number of regions of the particular type (in this case Eden) has been reached, but one more region is allocated anyway. Currently it is only used for Eden regions, when the young generation is extended because a GC cannot be performed while the GC-Locker is active.

G1HR EVAC-FAILURE 0x6f800000
During a GC, evacuation of an object from the given region failed because the heap is full and there is no space left to copy the object into. This event is generated within GC boundaries and exactly once for each region from which objects failed to evacuate.

When are heap regions reclaimed?

It is also worth mentioning when the heap regions in the G1 heap are reclaimed. All regions in the CSet (the ones that appear in CSET events) are reclaimed at the end of a GC, with the exception of regions with EVAC-FAILURE events. All regions with CLEANUP events are reclaimed. After a Full GC some regions are reclaimed (the ones the objects were moved out of), but that is not shown explicitly; instead, the non-empty regions left in the heap are printed with the POST-COMPACTION events.
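Since the G1HR events are interleaved with the regular GC log output, it helps to filter them out when studying region lifecycles. A minimal sketch, assuming the VM output was redirected to a file (gc.log is a placeholder name):

    # keep only the region events, then count evacuation failures
    grep '^G1HR' gc.log
    grep -c 'EVAC-FAILURE' gc.log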

    Read the article

  • Error correlation ID when trying to create a BI center site in Sharepoint 2010

    - by Patrick Olurotimi Ige
    Before you get to see the template for installing the BI center site, you have to activate the PerformancePoint Services Site Collection Features from Site Collection Administration > Site collection features. But after I activated it and tried to create the site, I got a correlation ID error. After looking at the ULS log files, I saw it was related to not being able to use SharePoint Server Publishing. So I went back to the site collection features and activated the SharePoint Server Publishing Infrastructure feature, and after that I could create a BI site.
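    For reference, the same feature activation can be scripted instead of done through the UI; a sketch using stsadm (the site collection URL is a placeholder, and PublishingSite is assumed to be the internal name of the SharePoint Server Publishing Infrastructure feature, so verify it on your farm first):

        REM activate the site-collection-scoped Publishing Infrastructure feature
        stsadm -o activatefeature -name PublishingSite -url http://server/sites/bicenter -force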

    Read the article

  • apt-get does not install dependencies

    - by Saran
    When I try to install the GnuTLS libraries (libgnutls26) and the generic Linux kernel headers (linux-headers-generic), I get the following error:

        The following packages have unmet dependencies:
         libgnutls26: Depends: libc6 (>= 2.14) but 2.15-0ubuntu10.4 is installed
                      Depends: zlib1g (>= 1:1.1.4) but 1:1.2.3.4.dfsg-3ubuntu4 is installed
         libgnutls26:i386: Depends: zlib1g (>= 1:1.1.4) but 1:1.2.3.4.dfsg-3ubuntu4 is installed
         linux-headers-generic: Depends: linux-headers-3.2.0-41-generic but it is not installed

    How can I fix this error?
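    A common first step for unmet-dependency errors like these is to let apt try to repair itself and then install the missing headers explicitly; a sketch rather than a guaranteed fix (the header package version is taken from the error message above):

        sudo apt-get update
        sudo apt-get -f install    # attempt to repair broken or held dependencies
        sudo apt-get install linux-headers-3.2.0-41-generic linux-headers-generic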

    Read the article

  • Kernel error during upgrade due to "/etc/default/grub: Syntax error: newline unexpected"

    - by Patrick - Developer
    Summary: upgrading linux-image-3.5.0-2-generic to linux-image-3.5.0-3-generic. The default Ubuntu 12.04 update has been generating the following error for weeks. Note: I'm using the default update mechanism of Ubuntu 12.04, i.e. apt-get update. Error log: https://gist.github.com/3036775 In short, the update tries to upgrade linux-image-3.5.0-2-generic to linux-image-3.5.0-3-generic and fails every single time. What can I do?
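    Given the "/etc/default/grub: Syntax error: newline unexpected" message in the title, one plausible line of attack is to syntax-check and repair that file before re-running the upgrade; a sketch, with the editor choice arbitrary:

        sh -n /etc/default/grub      # parse the file without executing it to locate the bad line
        sudo nano /etc/default/grub  # fix the reported line (often an unterminated quote)
        sudo update-grub
        sudo apt-get -f install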

    Read the article

  • Is this a clean way to manage AsyncResults with Generic Methods?

    - by Michael Stum
    I've contributed async support to a project I'm using, but I introduced a bug which I'm trying to fix. Basically I have this construct:

        private readonly Dictionary<WaitHandle, object> genericCallbacks =
            new Dictionary<WaitHandle, object>();

        public IAsyncResult BeginExecute<T>(RestRequest request, AsyncCallback callback, object state)
            where T : new()
        {
            var genericCallback = new RequestExecuteCaller<T>(this.Execute<T>);
            var asyncResult = genericCallback.BeginInvoke(request, callback, state);
            genericCallbacks[asyncResult.AsyncWaitHandle] = genericCallback;
            return asyncResult;
        }

        public RestResponse<T> EndExecute<T>(IAsyncResult asyncResult) where T : new()
        {
            var cb = genericCallbacks[asyncResult.AsyncWaitHandle] as RequestExecuteCaller<T>;
            genericCallbacks.Remove(asyncResult.AsyncWaitHandle);
            return cb.EndInvoke(asyncResult);
        }

    So I have a generic BeginExecute/EndExecute method pair. As I need to store the delegate that is called on EndExecute somewhere, I created a dictionary. I'm unsure about using WaitHandles as keys, though that seems to be the only safe choice. Does this approach make sense? Are WaitHandles unique, or could I end up with two equal ones? Or should I instead use the state (and wrap any user-provided state into my own state value)? Just to add: the class itself is non-generic; only the Begin/EndExecute methods are generic.

    Read the article

  • How to filter Many2Many / Generic Relations properly with Q?

    - by HWM-Rocker
    Hi, I have 3 models: TaggedObject has a GenericRelation to ObjectTagBridge, and ObjectTagBridge has a ForeignKey to the Tag model.

        class TaggedObject(models.Model):
            """ class that represents a tagged object """
            tags = generic.GenericRelation('ObjectTagBridge', blank=True, null=True)

        class ObjectTagBridge(models.Model):
            """ Helps to connect a generic object to a Tag. """
            # pylint: disable-msg=W0232,R0903
            content_type = models.ForeignKey(ContentType)
            object_id = models.PositiveIntegerField()
            content_object = generic.GenericForeignKey('content_type', 'object_id')
            tag = models.ForeignKey('Tag')

        class Tag(models.Model):
            ...

    When I attach a Tag to an object, I create a new ObjectTagBridge and set its ForeignKey tag to the Tag I want to attach. That works fine, and I can easily get all Tags attached to my object. But when I want to filter all objects that have Tag1 and Tag2, I tried something like this:

        query = Q(tags__tag=Tag1) & Q(tags__tag=Tag2)
        object_list = TaggedObject.objects.filter(query)

    but now my object_list is empty, because it is looking for TaggedObjects that have a single ObjectTagBridge with two tag objects: the first with Tag1 and the second with Tag2. My application will have more complex Q queries than this one, so I think I need a solution using the Q object, in fact any combination of binary conjunctions, like: (...) and ((...) or not(...)). How can I filter this correctly? Every answer is welcome; maybe there is also a different way to achieve this. Thanks for your help!

    Read the article

  • Efficient way to copy a collection of Nodes, treat them, and then serialize?

    - by Danjah
    Hi all, I initially thought a regex to remove YUI3 classNames (or whole class attributes) and id attributes from a serialized DOM string was a sound enough approach, but now I'm not sure, given the various warnings about using regex on HTML. I'm toying with the idea of making a copy of the DOM structure in question and performing:

        var nodeStructure = Y.one('#wrap').all('*'); // a YUI3 NodeList
        // Remove unwanted classNames... I'd need to maintain a list of them to remove :/
        nodeStructure.removeClass('unwantedClassName');

    and then:

        // I believe this can be done on a NodeList collection...
        nodeStructure.removeAttribute('id');

    I'm not quite sure what I'd need to do to 'copy' a collection of Nodes anyway, as I don't actually want to do the above to my living markup: it is only being saved, not 'closed' or 'exited', so a user could continue to change the markup and then save again. The above doesn't make a copy, I know. Is this efficient? Is there a better way to 'sanitize' my live markup of framework additions to the DOM (and maybe other things too at a later point) before saving it as a string? If it is a good approach, what's a safe way to go about copying my collection of Nodes for safe cleaning? Thanks! d

    Read the article

  • Presenting collection of structures of strings in grid or similar in WPF - Example? Ideas?

    - by Andrew
    Hi, I have a collection of structures, each of which is just some strings. Example:

        public struct ReportLine
        {
            public string Name;
            public string Address;
            public string Phone;
            // ... about 10 other strings
        }

    I can't change this part. What I want to do is display it in a simple grid or similar in WPF. My only requirements are: a) it needs column headers; b) rows must alternate in color; c) each column must be big enough to hold its largest datum (which is not known until run time). Can someone point me to an example to get me started? Is GridView the way to go? Or DataGrid? Or perhaps just the Grid? I have the book Pro WPF in C# 2008 and it covers binding ListBoxes to collections, but the collections always seem to be collections of one field (e.g. a collection of 40 names). Here I have a collection (an array, in fact) of a structure. How do I set up the data binding? As you can see, I'm new to this and would probably benefit most from a reference to an article. I've found articles covering collections of one field, but no examples covering binding to an array of structures. Also, my initial research indicates that c) can't be easily done if you demand that the column be big enough for all the data in that column, not just the visible data. Thanks, dave

    Read the article

  • How can data templates in generic.xaml get applied automatically?

    - by Thiado de Arruda
    I have a custom control that has a ContentPresenter which will have an arbitrary object set as its content. This object does not have any constraint on its type, so I want the control to display its content based on any data templates defined by the application or in Generic.xaml. If, in an application, I define a data template (without a key, because I want it applied automatically to objects of that type) and I bind the custom control to an object of that type, the data template gets applied automatically. But I also have data templates defined for some types in the Generic.xaml where I define the custom control's style, and those templates are not getting applied automatically. If I set an object of type 'PredefinedType' as the content of the ContentPresenter, the data template does not get applied. However, it works if I define the data template in the App.xaml of the application that's using the custom control. Does anyone have a clue? I really can't assume that the user of the control will define this data template, so I need some way to tie it up with the custom control.

    Read the article

< Previous Page | 67 68 69 70 71 72 73 74 75 76 77 78  | Next Page >