Search Results

Search found 60622 results on 2425 pages for 'linked data'.


  • CD-R suddenly unreadable?

    - by TheD
    I have a CD-R that worked fine for a good while, but all of a sudden Windows can no longer read its data. The disc isn't scratched, it's clean, and Windows still detects it and reports the space used on it. But whenever I try to access the stored data, either Explorer crashes, or after 5-10 minutes of trying to read the disc it opens and shows only a desktop.ini file. This happens on multiple machines. Any ideas? Is there a way to recover the data, for example some sector-by-sector recovery software for CDs, if such a thing exists?
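    A dedicated imaging tool such as GNU ddrescue is the usual answer here, but the sector-by-sector idea itself is simple enough to sketch in Python. The following is a minimal illustration rather than a polished recovery tool: it assumes a Linux system where the disc shows up as /dev/cdrom (both the device path and the output file name are placeholders), reads the disc in 2048-byte data sectors, and pads any unreadable sector with zeros so the rest of the image stays aligned.

        import os

        DEVICE = "/dev/cdrom"      # placeholder device path; often /dev/sr0
        IMAGE = "cd_image.iso"     # placeholder output file
        SECTOR = 2048              # user-data size of a mode-1 CD sector

        def rescue(device, image):
            bad = 0
            with open(device, "rb", buffering=0) as src, open(image, "wb") as dst:
                total = src.seek(0, os.SEEK_END)   # device size in bytes
                pos = 0
                while pos < total:
                    src.seek(pos)
                    try:
                        chunk = src.read(SECTOR)
                    except OSError:
                        chunk = b""                # read error on this sector
                    if not chunk:
                        chunk = b"\x00" * SECTOR   # pad the gap and keep going
                        bad += 1
                    dst.write(chunk[:SECTOR])
                    pos += SECTOR
            print(f"finished: {bad} unreadable sectors padded with zeros")

        if __name__ == "__main__":
            rescue(DEVICE, IMAGE)

    The resulting image can then be mounted or handed to file-recovery tools; whatever lived in the padded sectors is still lost, which is why a purpose-built tool that retries bad regions over multiple passes is preferable for anything important.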

    Read the article

  • Merge Publication Data Partition Snapshot generation

    - by David Osborn
    I have a merge publication in SQL Server 2008 R2 with data partitions, and I am wondering when I should generate the snapshots for the data partitions. When bringing a new subscriber online I sometimes get an error saying the partition snapshot is out of date, and I suspect this has to do with the publication snapshot being scheduled at the same time as the partition snapshots. I'm not really sure how the partition snapshot gets generated, but it appears it may be generated before the publication snapshot. If that is the case, how should I schedule the partition snapshots? Should I set them to run a certain number of minutes after the publication snapshot? That seems fragile in case the publication snapshot takes a while or fails. It seems to me the publication snapshot should simply kick off the data partition snapshot agents when it finishes.

    Read the article

  • How to link data in different worksheets

    - by user2961726
    I tried consolidation, but I cannot get the following to work; it keeps saying no data was consolidated. Could somebody try this dummy application and, if they figure out how to do what is described below, give me a step-by-step guide so I can learn to do it myself? I'm not sure whether this requires any coding. In the dummy application I have two worksheets, one named "1st" and the other "Cases". In the "1st" worksheet you can insert and delete records in the "Case" table at the bottom. What I want is this: when I insert a row into the Case table in worksheet "1st" and enter data for that row, the same data should automatically appear in the table in the "Cases" worksheet. I can't get this to work. Likewise, if I delete a row from the table in worksheet "1st", that record should automatically be removed from the "Cases" worksheet table. Please help. The spreadsheet is here: http://ge.tt/8sjdkVx/v/0

    Read the article

  • How to backup metro "app" data, manually

    - by ihateapps
    I'm a PC tech and I have been getting more and more Windows 8 issues. First off I hate Metro, and secondly I hate "apps". I like to do fresh installs of Windows 8: I back up the user's data and then reinstall. Assuming there is no recovery partition, how can I manually back up the user's app data, so that after a reformat I can simply redownload their apps (annoying - I wish there were a way to actually back the apps up too) and drop the data back in? Do most people even use "apps", or do they use Windows 8 like a desktop OS? I dropped Metro entirely with Classic Shell.
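    For what it's worth, the per-user data for Metro/Store apps lives under %LOCALAPPDATA%\Packages, one folder per package family. A rough sketch of a manual backup in Python is below; the backup destination is just a placeholder, some files may be locked while their app is running, and there is no guarantee every app will accept its old data back after a reinstall, so treat this as a best-effort copy rather than a supported backup path.

        import os
        import shutil

        SOURCE = os.path.join(os.environ["LOCALAPPDATA"], "Packages")
        BACKUP_DIR = r"D:\Backups\metro-app-data"   # placeholder destination

        def backup_app_data():
            for package in os.listdir(SOURCE):
                src = os.path.join(SOURCE, package)
                dst = os.path.join(BACKUP_DIR, package)
                if os.path.isdir(src):
                    # dirs_exist_ok needs Python 3.8+; locked files raise
                    # shutil.Error, which we just report and move past.
                    try:
                        shutil.copytree(src, dst, dirs_exist_ok=True)
                    except shutil.Error as exc:
                        print(f"partial copy of {package}: {exc}")

        if __name__ == "__main__":
            backup_app_data()

    Restoring is the same copy in the other direction, done after the apps themselves have been reinstalled from the Store.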

    Read the article

  • Python: how to calculate data received and send between two ipaddresses and ports [closed]

    - by ramdaz
    I guess this is socket programming, but I have never done socket programming except for running the tutorial examples while learning Python, so I need some ideas on how to implement this. What I specifically need is a monitoring program on a server that polls or listens to the traffic exchanged between different IPs across various popular ports. For example, how do I measure the data received and sent through port 80 between 192.168.1.10 and 192.168.1.1 (which is the gateway)? I checked out a number of ready-made tools like MRTG, Bwmon and ntop, but since we are looking at doing some specific pattern studies, we need to do the data capture inside our own program. The idea is to monitor some popular ports, study the network traffic over certain periods and compare it with other data. We would like to find a way to do all of this in Python.
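    One way to do this in pure Python, sketched below, is a raw AF_PACKET socket that parses just enough of each Ethernet/IPv4/TCP header to attribute bytes to a direction. This is Linux-only, needs root, ignores IPv6 and fragmentation, and the two host addresses and the port are simply the example values from the question, so treat it as a starting point rather than a finished monitor (libraries such as scapy or a pcap binding will do much of this work for you).

        import socket
        import struct

        HOST_A = "192.168.1.10"   # example endpoints from the question
        HOST_B = "192.168.1.1"
        PORT = 80

        ETH_P_ALL = 0x0003        # capture every protocol

        def monitor():
            # Raw packet socket: Linux-only and requires root privileges.
            sock = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                                 socket.ntohs(ETH_P_ALL))
            counts = {"a_to_b": 0, "b_to_a": 0}
            try:
                while True:
                    frame, _ = sock.recvfrom(65535)
                    if struct.unpack("!H", frame[12:14])[0] != 0x0800:
                        continue                      # not IPv4
                    ip = frame[14:34]
                    ihl = (ip[0] & 0x0F) * 4          # IP header length
                    if ip[9] != 6:
                        continue                      # not TCP
                    src = socket.inet_ntoa(ip[12:16])
                    dst = socket.inet_ntoa(ip[16:20])
                    tcp = 14 + ihl
                    sport, dport = struct.unpack("!HH", frame[tcp:tcp + 4])
                    if PORT not in (sport, dport):
                        continue
                    if (src, dst) == (HOST_A, HOST_B):
                        counts["a_to_b"] += len(frame)
                    elif (src, dst) == (HOST_B, HOST_A):
                        counts["b_to_a"] += len(frame)
            except KeyboardInterrupt:
                print("bytes on port %d: %s" % (PORT, counts))

        if __name__ == "__main__":
            monitor()

    Counting happens per captured frame, so the totals include the link-layer and TCP/IP headers; subtract those if only payload bytes matter for the comparison.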

    Read the article

  • Data CD for audiobooks?

    - by Marco7757
    I'm trying to burn my .m4b audiobook files to a CD. I was impressed by the compression rate (10 hours of audiobook in 150 MB?!). The problem is that I cannot burn them as an audio CD, since an audio CD only holds about 80 minutes of audio and the audiobook runs more than 10 hours. So I burned them as a data CD instead. That works, but the downside of a data CD is of course that not every player (e.g. in a car or stereo) can play data CDs. What can I do? I don't want to waste 100 CDs on such a simple problem. Is there any way to burn this as an audio CD? Going by file size alone it shouldn't be a problem, should it? Why can an audio CD only hold up to 80 minutes?
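    The 80-minute ceiling has nothing to do with file size: an audio CD does not store files at all, it stores uncompressed 16-bit, 44.1 kHz stereo PCM, so the compact .m4b files would have to be decoded back to raw audio before burning. A quick back-of-the-envelope calculation shows why the capacities work out the way they do (the sector sizes and sample rate are the standard CD values):

        # Uncompressed CD audio: 44,100 samples/s * 2 bytes * 2 channels
        bytes_per_second = 44_100 * 2 * 2            # 176,400 B/s
        audio_minutes = 80
        audio_bytes = bytes_per_second * audio_minutes * 60
        print(f"80 min of CD audio = {audio_bytes / 1e6:.0f} MB of raw PCM")

        # The same disc used for data holds less, because a data sector keeps
        # only 2048 of its 2352 bytes for user data (the rest is error correction).
        sectors = audio_minutes * 60 * 75            # 75 sectors per second
        data_bytes = sectors * 2048
        print(f"as a data CD that is about {data_bytes / 1e6:.0f} MB")

    So roughly 847 MB of raw audio versus roughly 737 MB of data fit on the same 80-minute disc, which is why ten hours of compressed audiobook fits comfortably as data but can never fit on a single standard audio CD. Players that can read data discs with compressed audio (often sold as "MP3 CD" support) are the usual workaround, though they may still not handle the .m4b container; otherwise the options are splitting the book across many audio CDs or using a different medium such as a USB stick.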

    Read the article

  • Best alternatives to recover lost directories in FAT32 external hard drive?

    - by Sergio
    I have a 320 GB ADATA CH91 external hard drive. I suspect it has a problem with the USB connector: on certain occasions write operations fail and data is lost. I just lost a directory containing several GB of very useful information, and I have not written to the disk since. What tool would you recommend to recover the lost data? The disk is FAT32 formatted (a single partition) and I use both Linux and Windows. Also, what filesystem would you recommend to avoid future data loss? I currently use this external drive only in Linux, so there are several choices available (FAT, NTFS, ext3, ext4, ReiserFS, etc.).

    Read the article

  • External Hard Drive needs format problem

    - by Saher
    I recently bought a new 500 GB ADATA Classic external hard drive. I had transferred around 29 GB of data onto it before I installed my new Windows 7 operating system. After some work with the drive (copying and deleting files), I disconnected it, and now it won't open again; Windows asks me to format it. I don't want to format the drive, as it holds important data I need. Is there a way I can retrieve my data? Is the Recover My Files program from GetData a reasonable choice? The second part of my question: why might something like this happen (requiring a format before the drive will open)? Is it a problem with the hard drive itself, or just a corrupted file or folder? Thanks.

    Read the article

  • Host data transfer limit calculations and network protocol headers

    - by UpTheCreek
    OK, this might be a really stupid question, but... I'm building a web app that uses websockets. There's fairly rapid messaging going on, so I've been looking at the network traffic in Wireshark to see whether there's any way to reduce the amount of data we send over the wire, and hence costs. A typical message has roughly a 150-byte data payload, and according to Wireshark the lower layers add about:
    Ethernet: 14 bytes
    IP: 20 bytes
    TCP: 20 bytes
    My question is: are these network headers included in data transfer calculations? What about TCP ACK segments (another 54 bytes according to Wireshark)? This may seem petty, but because we have so much messaging going on, and because the payload is a similar size to these headers, it's significant.
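    Whether the headers are billed depends entirely on where the host meters the traffic: metering at the switch port counts whole layer-2 frames, while metering on IP byte counters ignores the Ethernet framing, so that part is worth asking the provider directly. The proportions, however, are easy to estimate. Below is a rough sketch assuming one TCP segment per message and one standalone ACK in return (real stacks frequently coalesce or piggyback ACKs, which lowers the overhead):

        PAYLOAD = 150            # bytes of websocket payload per message
        HEADERS = 14 + 20 + 20   # Ethernet + IPv4 + TCP, from the Wireshark capture
        ACK = 54                 # a bare ACK segment is headers only

        on_wire = PAYLOAD + HEADERS + ACK
        overhead = on_wire - PAYLOAD
        print(f"{on_wire} bytes on the wire per message, "
              f"{overhead} of which ({100 * overhead / on_wire:.0f}%) is overhead")

    With these numbers roughly 40% of each exchange is protocol overhead, which is why batching several small messages into one websocket frame tends to pay off more than shaving bytes from the payload itself.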

    Read the article

  • Is there a Utility that will scan selected locations and return all files older than a certain date?

    - by CT
    Can anyone recommend a utility that can scan specified directory locations (network shares specifically) and return all files older than a certain date? I am looking to implement a data retention policy at my workplace. As our amount of data grows, it puts a large strain on our backup routines, so I would like to move old data to some sort of archival system. Extra points for the ability to move the matching old files to another location for archiving, and for the ability to schedule when this occurs. Many thanks. EDIT: We are a Windows shop, mostly Windows Server 2003.
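    If nothing off the shelf fits, the scan itself is a few lines of Python that run fine against UNC paths from a Windows box. The share and archive paths below are placeholders, and the move step is left commented out so the script is read-only until you are happy with the file list; scheduling is then just a Task Scheduler job that runs the script.

        import os
        import shutil
        import time

        SCAN_ROOTS = [r"\\fileserver\projects", r"\\fileserver\shared"]  # placeholders
        ARCHIVE_ROOT = r"\\archiveserver\old-data"                       # placeholder
        CUTOFF_DAYS = 3 * 365

        def old_files(roots, cutoff_days):
            cutoff = time.time() - cutoff_days * 86400
            for root in roots:
                for dirpath, _dirs, names in os.walk(root):
                    for name in names:
                        path = os.path.join(dirpath, name)
                        try:
                            if os.path.getmtime(path) < cutoff:
                                yield path
                        except OSError:
                            pass   # locked or unreadable file, skip it

        if __name__ == "__main__":
            for path in old_files(SCAN_ROOTS, CUTOFF_DAYS):
                print(path)
                # To archive instead of just listing, move the file and keep
                # its relative path under ARCHIVE_ROOT, for example:
                # rel = os.path.splitdrive(path)[1].lstrip("\\")
                # dest = os.path.join(ARCHIVE_ROOT, rel)
                # os.makedirs(os.path.dirname(dest), exist_ok=True)
                # shutil.move(path, dest)

    Last-modified time is the usual criterion for retention policies; switch to os.path.getatime if last access is what the policy actually specifies.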

    Read the article

  • Moving to Data Center

    - by Won
    Please give me any advice you can. Our company has decided to move its servers to a data center because we are having major network congestion. The data center provides us with 100 Mbps of bandwidth and a full 42U cabinet. Right now I am planning to set up two firewalls for failover and to change the DNS records for the web server. Is there anything I need to be aware of before I move the following to the data center?
    1. Web server
    2. Exchange server
    3. SQL servers

    Read the article

  • Enterprise Data Center System Admin/Engineer to Server Ratio

    - by Bob
    I know similar questions have been asked over the last few months, but I am looking at data center operations and know there are some really smart people out there who might be able to help. I am looking for staffing best practices based on first-hand experience. The environment: three high-availability (99.99% plus) enterprise-level data centers, geographically separated, one staffed 24x7x365, one lights-out, and one co-location, running hot-hot-hot and supporting a global community. More than 2,000 operating system instances (95% Windows, 5% Linux and Solaris), 45% virtualized, and more than 100 TB of storage. No desktop support and no network administration (that is handled separately), running N+1 and serving more than 250 billion page views annually. Based on your experience, what server-to-"data center system administrator/engineer" ratio have you seen work? Thanks in advance for your responses.

    Read the article

  • Data storage solutions for rapidly running out of space

    - by Grimlockz
    I have two web servers (one live and one backup), and the issue is that our storage is rapidly running out. All the data on the server is used by our customers and new documents are uploaded daily, so nothing can be deleted; it is always in use. We use a flat file structure with no database. I'm looking for solutions or ideas for the best place to move our data. The data has to be secure and needs to live in a Linux environment. I'm not sure where to start: clusters, VMware, or other such solutions for huge file servers?

    Read the article

  • Wifi installation issues on Ubuntu 10.10

    - by SlyrNemesis
    Linux newbie here. I run Ubuntu 10.10 and have a Sitecom 300N X2 wireless network dongle with the 8192SU chipset. I used ndiswrapper to install the Windows wireless driver, because Sitecom doesn't provide a Linux driver. ndiswrapper reports the hardware as present, but it doesn't find any wireless networks, nor does it connect to one. What can I do? The command "dmesg | grep ndis" gives the following output (line breaks added for readability):
    [ 9.999954] ndiswrapper version 1.56 loaded (smp=yes, preempt=no)
    [ 11.111901] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisAllocateNetBufferAndNetBufferList'
    [ 11.111973] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMIndicateReceiveNetBufferLists'
    [ 11.112099] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMRegisterMiniportDriver'
    [ 11.112161] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisAllocateMdl'
    [ 11.112220] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMDeregisterMiniportDriver'
    [ 11.112280] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisFreeNetBufferListPool'
    [ 11.112339] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisAllocateNetBufferListPool'
    [ 11.112399] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisFreeMdl'
    [ 11.112457] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMAllocatePort'
    [ 11.112515] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMNetPnPEvent'
    [ 11.112573] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMFreePort'
    [ 11.112631] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMSendNetBufferListsComplete'
    [ 11.112780] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMSetMiniportAttributes'
    [ 11.112848] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisOpenConfigurationEx'
    [ 11.112946] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMIndicateStatusEx'
    [ 11.113017] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisMOidRequestComplete'
    [ 11.113112] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisAllocateMemoryWithTagPriority'
    [ 11.113200] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisAllocateIoWorkItem'
    [ 11.113271] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisFreeIoWorkItem'
    [ 11.113342] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisQueueIoWorkItem'
    [ 11.113413] ndiswrapper (import:233): unknown symbol: NDIS.SYS:'NdisFreeNetBufferList'
    [ 11.113481] ndiswrapper (import:233): unknown symbol: WDFLDR.SYS:'WdfVersionBind'
    [ 11.113547] ndiswrapper (import:233): unknown symbol: WDFLDR.SYS:'WdfVersionBindClass'
    [ 11.113613] ndiswrapper (import:233): unknown symbol: WDFLDR.SYS:'WdfVersionUnbindClass'
    [ 11.113680] ndiswrapper (import:233): unknown symbol: WDFLDR.SYS:'WdfVersionUnbind'
    [ 11.113742] ndiswrapper (load_sys_files:206): couldn't prepare driver 'net8192su'
    [ 11.148888] ndiswrapper (load_wrap_driver:108): couldn't load driver net8192su; check system log for messages from 'loadndisdriver'
    [ 11.365200] usbcore: registered new interface driver ndiswrapper
    [ 12.818573] Modules linked in: snd_wavefront snd_emu10k1(+) snd_cs4236 snd_usb_audio snd_wss_lib snd_opl3_lib snd_ac97_codec ac97_bus nouveau(+) snd_pcm i915 snd_usbmidi_lib snd_util_mem snd_page_alloc snd_hwdep snd_mpu401 snd_mpu401_uart snd_seq_midi snd_rawmidi ttm snd_seq_midi_event drm_kms_helper snd_seq ppdev snd_timer snd_seq_device drm ndiswrapper snd parport_pc emu10k1_gp intel_agp ns558 gameport soundcore i2c_algo_bit shpchp lp video output agpgart parport usbhid hid 8139too 8139cp mii floppy
    (the same "Modules linked in:" list is then repeated verbatim at timestamps 12.819183 through 12.824204)

    Read the article

  • Solving Big Problems with Oracle R Enterprise, Part II

    - by dbayard
    Part II - Solving Big Problems with Oracle R Enterprise

    In the first post in this series (see https://blogs.oracle.com/R/entry/solving_big_problems_with_oracle), we showed how you can use R to perform historical rate of return calculations against investment data sourced from a spreadsheet. We demonstrated the calculations against sample data for a small set of accounts. While this worked fine, in the real world the problem is much bigger because the amount of data is much bigger, so much bigger that the approach in the previous post won't scale to meet real-world needs.

    From our previous post, here are the challenges we need to conquer:
    - The actual data lives in a database, not in a spreadsheet.
    - The actual data is much, much bigger: too big to fit into normal R memory and too big to want to move across the network.
    - The overall process needs to run fast, much faster than a single processor allows.
    - The actual data needs to be kept secure, another reason not to move it out of the database and across the network.
    - The IRR calculation needs to be integrated with other database ETL activities, so that IRRs can be calculated as part of the data warehouse refresh process.

    In this post, we will show how we moved from the sample-data environment to working with full-scale data. This post is based on actual work we did for a financial services customer during a recent proof of concept.

    Getting started with the Database

    At this point, we have some sample data and our IRR function. We were at a similar point in our customer proof-of-concept exercise: we had sample data but did not yet have the full customer data, so our database was empty. This was easily rectified by leveraging the transparency features of Oracle R Enterprise (see https://blogs.oracle.com/R/entry/analyzing_big_data_using_the). The following code shows how we took our sample data SimpleMWRRData and easily turned it into a new Oracle database table called IRR_DATA via ore.create(). The code also shows how we can access the database table IRR_DATA as if it were a normal R data.frame named IRR_DATA.

    If we go to SQL*Plus, we can also check out our new IRR_DATA table.

    At this point we have our sample data loaded in the database as a normal Oracle table called IRR_DATA, so we proceeded to test our R function against database data. As our first test, we retrieved the data for a single account from the IRR_DATA table, pulled it into local R memory, and called our IRR function. This worked, and no SQL coding was required!

    Going from Crawling to Walking

    Now that we had shown our R code working with database-resident data for a single account, we wanted to do the same for multiple accounts. In other words, we wanted to implement the split-apply-combine technique we discussed in the first post in this series. Fortunately, Oracle R Enterprise provides a very scalable way to do this with a function called ore.groupApply(). You can read more about ore.groupApply() here: https://blogs.oracle.com/R/entry/analyzing_big_data_using_the1

    Here is an example of how we ask ORE to take our IRR_DATA table in the database, split it by the ACCOUNT column, apply a function that calls our SimpleMWRR() calculation, and then combine the results. (If you are following along at home, be sure to have installed our myIRR package on your database server via "R CMD INSTALL myIRR".)

    The interesting thing about ore.groupApply is that the calculation is not actually performed in the desktop R environment from which I am running it. What actually happens is that ore.groupApply uses the Oracle database to perform the work. The Oracle database is what splits the IRR_DATA table by ACCOUNT; it then takes the data for each account and sends it to an embedded R engine running on the database server to apply our R function, and finally combines all the individual results from the calls to the R function.

    This is significant because the embedded R engine only needs to deal with the data for a single account at a time. Whether we have 20 accounts or 1 million accounts or more, the R engine that performs the calculation does not care. Given that normal R has a finite amount of memory to hold data, the ore.groupApply approach overcomes the R memory scalability problem, since we only need to fit the data for a single account in R memory (not the data for all of the accounts).

    Additionally, the IRR_DATA does not need to be sent from the database to my desktop R program. Even though I invoke ore.groupApply from my desktop R program, the actual SimpleMWRR calculation is run by the embedded R engine on the database server, so the IRR_DATA never leaves the database server. This is both a performance benefit, because network transmission of large amounts of data takes time, and a security benefit, because it is harder to protect private data once you start shipping it around your intranet. Another benefit, discussed a few paragraphs below, is the ability to leverage Oracle database parallelism to run these calculations for dozens of accounts at once.

    From Walking to Running

    ore.groupApply is rather nice, but it still has the drawback that I run it from a desktop R instance. That is not ideal for integrating into typical operational processes like nightly data warehouse refreshes or monthly statement generation. This is not an issue for ORE, however: Oracle R Enterprise lets us run this from the database using regular SQL, which is easily integrated into standard operations. That is extremely exciting, and it is the way we actually did these calculations in the customer proof.

    Oracle R Enterprise provides a SQL equivalent to ore.groupApply, which it refers to as "rqGroupEval". To use rqGroupEval via SQL, a bit of simple setup is needed: basically, the Oracle database needs to know the structure of the input table and the grouping column, which we define using the database's pipelined table function mechanisms. Here is the setup script:

    At this point, our initial setup of rqGroupEval is done for the IRR_DATA table. The next step is to define our R function to the database, which we do via a call to ORE's rqScriptCreate.

    Now we can test it. The SQL you use to run rqGroupEval uses the Oracle database pipelined table function syntax. The first argument to irr_dataGroupEval is a cursor defining our input; you can add additional where clauses and subqueries to this cursor as appropriate. The second argument is any additional inputs to the R function. The third argument is the text of a dummy select statement, which the database uses to identify the columns and datatypes the R function is expected to return. The fourth argument is the column of the input table to split/group by. The final argument is the name of the R function as you defined it when you called rqScriptCreate().

    The Real-World Results

    In our real customer proof of concept, we had more sophisticated calculation requirements than shown in this simplified blog example. For instance, we had to perform the rate of return calculations for 5 separate time periods, so the R code was enhanced to do so. In addition, some accounts needed a time-weighted rate of return, so we extended our approach and added an R function to do that. And finally, there were a few more real-world data irregularities to account for, so we added logic to our R functions to deal with those exceptions.

    For the full-scale customer test, we loaded the customer data onto a half-rack Exadata X2-2 Database Machine. As our half-rack had 48 physical cores (96 threads with hyperthreading), we wanted to take advantage of that CPU horsepower to speed up our calculations. With ORE, doing so is as simple as leveraging the Oracle Database parallel query features. Let's look at the SQL used in the customer proof: notice that we use a parallel hint on the cursor that is the input to our rqGroupEval function. That is all we need to do to enable Oracle to use parallel R engines.

    Here are a few screenshots of what this SQL looked like in the Real-Time SQL Monitor when we ran it during the proof of concept (hint: you might need to right-click on these images to view them full-screen). From the screenshots you can notice a few things (the first several items correspond to the highlighted numbers on the images above):
    - The SQL completed in 110 seconds (1.8 minutes).
    - We calculated rates of return for 5 time periods for each of 911k accounts (the number of actual rows returned by the IRRSTAGEGROUPEVAL operation).
    - We accessed 103m rows of detailed cash flow/market value data (the number of actual rows returned by the IRR_STAGE2 operation).
    - We ran with 72 degrees of parallelism spread across 4 database servers.
    - Most of our 110 seconds was spent in the "External Procedure call" event.
    - On average, we performed 8,200 executions of our R function per second (911k accounts / 110 s).
    - On average, each execution was passed 110 rows of data (103m detail rows / 911k accounts).
    - On average, we did 41,000 single-time-period rate of return calculations per second (each of the 8,200 executions of our R function did rate of return calculations for 5 time periods).
    - On average, we processed over 900,000 rows of database data in R per second (103m detail rows / 110 s).

    R + Oracle R Enterprise: Best of R + Best of Oracle Database

    This blog post series started by describing a real customer problem: how to perform a lot of calculations on a lot of data in a short period of time. While standard R proved to be a very good fit for writing the necessary calculations, the challenge of working with a lot of data in a short period of time remained. This series showed how Oracle R Enterprise enables R to be used in conjunction with the Oracle Database to overcome the data volume and performance issues (as well as simplifying the operations and security issues). It also showed that we could calculate 5 time periods of rates of return for almost a million individual accounts in less than 2 minutes.

    In a future post, we will take the same R function and show how Oracle R Connector for Hadoop can be used in the Hadoop world. In that post, instead of having our data in an Oracle database, our data will live in Hadoop, and we will show how to use the Oracle R Connector for Hadoop and other Oracle Big Data Connectors to move data between Hadoop, R, and the Oracle Database easily.

    Read the article

  • Standard Network Tiers in a Distributed N-Tier System

    Distributed N-tier client/server architecture allows segments of an application to be broken up and distributed across multiple locations on a network. Listed below are the standard tiers in a distributed N-tier system.

    End-User Client Tier
    The end-user client is responsible for sending requests to web servers and other application servers, receiving their responses, and translating those responses so that the end user can interpret the data effectively. The primary roles of this tier are to communicate with servers and to translate server responses for the end user.
    Business-specific functions:
    - Validate data
    - Display data
    - Send data to the web server

    Web Server Tier
    The web server tier processes incoming requests on the HTTP and HTTPS ports. It primarily handles generation of user interfaces and calls the application server whenever it needs data or business logic.
    Business-specific functions:
    - Send data to the application server
    - Format data for display
    - Validate data

    Application Server Tier
    The application server stores and executes the predefined business logic that is applied to data as the business requires, and returns the processed data to the web server. This server also calls the database directly to retrieve and store any data used by the system.
    Business-specific functions:
    - Validate data
    - Process data
    - Send data to the database server

    Database Server Tier
    The database server is responsible for storing and returning all data needed by the calling applications. The primary role of this server is storage: data is stored as needed and can be recalled at any later point in time.
    Business-specific functions:
    - Insert data
    - Delete data
    - Return data to the application server

    Read the article

  • Cant bind data to a table view

    - by sudhakarilla
    Hi, I have retrieved data from a JSON URL and displayed it in a table view. I have also included a button in the table view. On tapping the button, the data must be transferred to another table view. The problem is that I could send the data to a view and display it on a label, but I couldn't bind the data to the table view. Here are some of the code snippets.
    The Buy button:
    -(IBAction)Buybutton{
        /* UIAlertView *alert =[[UIAlertView alloc]initWithTitle:@"thank u" message:@"products" delegate:nil cancelButtonTitle:@"ok" otherButtonTitles:nil];
        [alert show];
        [alert release];*/
        Product *selectedProduct = [[data products]objectAtIndex:0];
        CartViewController *cartviewcontroller = [[[CartViewController alloc] initWithNibName:@"CartViewController" bundle:nil]autorelease];
        cartviewcontroller.product= selectedProduct;
        //NSString *productname=[product ProductName];
        //[currentproducts setproduct:productname];
        [self.view addSubview:cartviewcontroller.view];
    }
    The cart view:
    // Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
    - (void)viewDidLoad {
        [super viewDidLoad];
        data = [GlobalData SharedData];
        NSMutableArray *prod =[[NSMutableArray alloc]init];
        prod = [data products];
        for(NSDictionary *product in prod) {
            Cart *myprod = [[Cart alloc]init];
            myprod.Description = [product Description];
            myprod.ProductImage =[product ProductImage];
            myprod.ProductName = [product ProductName];
            myprod.SalePrice = [product SalePrice];
            [data.carts addObject:myprod];
            [myprod release];
        }
        Cart *cart = [[data carts]objectAtIndex:0];
        NSString *productname=[cart ProductName];
        self.label.text =productname;
        NSLog(@"carts");
    }
    - (NSInteger)numberOfSectionsInTableView:(UITableView *)tableView {
        return 1;
    }
    - (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
        return [data.carts count];
    }
    - (CGFloat)tableView:(UITableView *)tableView heightForRowAtIndexPath:(NSIndexPath *)indexPath {
        return 75;
    }
    - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
        NSLog(@"cellforrow");
        static NSString *CellIdentifier = @"Cell";
        ProductCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
        if(cell ==nil) {
            cell = [[[ProductCell alloc]initWithFrame:CGRectZero reuseIdentifier:CellIdentifier]autorelease];
        }
        NSUInteger row = [indexPath row];
        Cart *cart = [[data carts]objectAtIndex:row];
        cell.productNameLabel.text = [cart ProductName];
        /*NSString *sale = [[NSString alloc]initWithFormat:@"SalePrice:%@",[cart SalePrice]];
        cell.salePriceLabel.text = sale;
        cell.DescriptionLabel.text = [cart Description];
        NSMutableString imageUrl =[NSMutableString string];
        [imageUrl appendFormat:@"http://demo.s2commerce.net/DesktopModules/S2Commerce/Images/Products/%@",[product ProductImage]];
        NSLog(@"imageurl:%@",imageUrl);
        NSString mapURL = [imageUrl stringByReplacingPercentEscapesUsingEncoding:NSASCIIStringEncoding];
        NSData* imageData = [[NSData alloc]initWithContentsOfURL:[NSURL URLWithString:mapURL]];
        UIImage* image = [[UIImage alloc]initWithData:imageData];
        cell.productImageview.image = image;
        [imageData release];
        [image release];*/
        return cell;
    }
    I am also getting the following error in the console:
    2010-06-11 18:34:29.169 navigation[4109:207] * -[CartViewController tableView:numberOfRowsInSection:]: message sent to deallocated instance 0xcb4d4f90

    Read the article

  • Multiple data series in real time plot

    - by Gr3n
    Hi, I'm kind of new to Python and trying to create a plotting app for values read via RS232 from a sensor. I've managed (after some reading and copying examples online) to get a plot working that updates on a timer which is great. My only trouble is that I can't manage to get multiple data series into the same plot. Does anyone have a solution to this? This is the code that I've worked out this far: import os import pprint import random import sys import wx # The recommended way to use wx with mpl is with the WXAgg backend import matplotlib matplotlib.use('WXAgg') from matplotlib.figure import Figure from matplotlib.backends.backend_wxagg import FigureCanvasWxAgg as FigCanvas, NavigationToolbar2WxAgg as NavigationToolbar import numpy as np import pylab DATA_LENGTH = 100 REDRAW_TIMER_MS = 20 def getData(): return int(random.uniform(1000, 1020)) class GraphFrame(wx.Frame): # the main frame of the application def __init__(self): wx.Frame.__init__(self, None, -1, "Usart plotter", size=(800,600)) self.Centre() self.data = [] self.paused = False self.create_menu() self.create_status_bar() self.create_main_panel() self.redraw_timer = wx.Timer(self) self.Bind(wx.EVT_TIMER, self.on_redraw_timer, self.redraw_timer) self.redraw_timer.Start(REDRAW_TIMER_MS) def create_menu(self): self.menubar = wx.MenuBar() menu_file = wx.Menu() m_expt = menu_file.Append(-1, "&Save plot\tCtrl-S", "Save plot to file") self.Bind(wx.EVT_MENU, self.on_save_plot, m_expt) menu_file.AppendSeparator() m_exit = menu_file.Append(-1, "E&xit\tCtrl-X", "Exit") self.Bind(wx.EVT_MENU, self.on_exit, m_exit) self.menubar.Append(menu_file, "&File") self.SetMenuBar(self.menubar) def create_main_panel(self): self.panel = wx.Panel(self) self.init_plot() self.canvas = FigCanvas(self.panel, -1, self.fig) # pause button self.pause_button = wx.Button(self.panel, -1, "Pause") self.Bind(wx.EVT_BUTTON, self.on_pause_button, self.pause_button) self.Bind(wx.EVT_UPDATE_UI, self.on_update_pause_button, self.pause_button) self.hbox1 = wx.BoxSizer(wx.HORIZONTAL) self.hbox1.Add(self.pause_button, border=5, flag=wx.ALL | wx.ALIGN_CENTER_VERTICAL) self.vbox = wx.BoxSizer(wx.VERTICAL) self.vbox.Add(self.canvas, 1, flag=wx.LEFT | wx.TOP | wx.GROW) self.vbox.Add(self.hbox1, 0, flag=wx.ALIGN_LEFT | wx.TOP) self.panel.SetSizer(self.vbox) #self.vbox.Fit(self) def create_status_bar(self): self.statusbar = self.CreateStatusBar() def init_plot(self): self.dpi = 100 self.fig = Figure((3.0, 3.0), dpi=self.dpi) self.axes = self.fig.add_subplot(111) self.axes.set_axis_bgcolor('white') self.axes.set_title('Usart data', size=12) pylab.setp(self.axes.get_xticklabels(), fontsize=8) pylab.setp(self.axes.get_yticklabels(), fontsize=8) # plot the data as a line series, and save the reference # to the plotted line series # self.plot_data = self.axes.plot( self.data, linewidth=1, color="blue", )[0] def draw_plot(self): # redraws the plot xmax = len(self.data) if len(self.data) > DATA_LENGTH else DATA_LENGTH xmin = xmax - DATA_LENGTH ymin = 0 ymax = 4096 self.axes.set_xbound(lower=xmin, upper=xmax) self.axes.set_ybound(lower=ymin, upper=ymax) # enable grid #self.axes.grid(True, color='gray') # Using setp here is convenient, because get_xticklabels # returns a list over which one needs to explicitly # iterate, and setp already handles this. 
# pylab.setp(self.axes.get_xticklabels(), visible=True) self.plot_data.set_xdata(np.arange(len(self.data))) self.plot_data.set_ydata(np.array(self.data)) self.canvas.draw() def on_pause_button(self, event): self.paused = not self.paused def on_update_pause_button(self, event): label = "Resume" if self.paused else "Pause" self.pause_button.SetLabel(label) def on_save_plot(self, event): file_choices = "PNG (*.png)|*.png" dlg = wx.FileDialog( self, message="Save plot as...", defaultDir=os.getcwd(), defaultFile="plot.png", wildcard=file_choices, style=wx.SAVE) if dlg.ShowModal() == wx.ID_OK: path = dlg.GetPath() self.canvas.print_figure(path, dpi=self.dpi) self.flash_status_message("Saved to %s" % path) def on_redraw_timer(self, event): if not self.paused: newData = getData() self.data.append(newData) self.draw_plot() def on_exit(self, event): self.Destroy() def flash_status_message(self, msg, flash_len_ms=1500): self.statusbar.SetStatusText(msg) self.timeroff = wx.Timer(self) self.Bind( wx.EVT_TIMER, self.on_flash_status_off, self.timeroff) self.timeroff.Start(flash_len_ms, oneShot=True) def on_flash_status_off(self, event): self.statusbar.SetStatusText('') if __name__ == '__main__': app = wx.PySimpleApp() app.frame = GraphFrame() app.frame.Show() app.MainLoop()

    Read the article

  • How do I move a linked file on Unix?

    - by r3mbol
    I have a bunch of files in one directory and links to each one of those files in another directory, so ls -l looks something like this:
    lrwxrwxrwx 1 rembol rembol   89 Jan 25 10:00 copyright.txt -> /home/rembol/solr/target/deploy/data/core/copyright.txt
    lrwxrwxrwx 1 rembol rembol   92 Jan 25 10:00 jar-versions.xml -> /home/rembol/solr/target/deploy/data/core/jar-versions.xml
    lrwxrwxrwx 1 rembol rembol   85 Jan 25 10:00 lgpl.html -> /home/rembol/solr/target/deploy/data/core/lgpl.html
    lrwxrwxrwx 1 rembol rembol   79 Jan 25 10:00 lib -> /home/rembol/solr/target/deploy/data/core/lib
    lrwxrwxrwx 1 rembol rembol   87 Jan 25 10:00 readme.html -> /home/rembol/solr/target/deploy/data/core/readme.html
    drwxr-xr-x 3 rembol rembol 4096 Jan 25 10:00 server
    drwxr-xr-x 2 rembol rembol 4096 Jan 25 10:00 startup
    Now I want to move the linked files from /home/rembol/solr/target/deploy to /home/rembol/output/. If I do that by simply calling mv, the links will break. I don't want to re-link each file separately, because there are hundreds of them (they are generated automatically). Is there some clever way to move linked files, rather than writing a script that unlinks, moves and relinks recursively for each file in each subdirectory?
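    Two approaches come to mind. The simplest avoids touching the links at all: move the deploy directory and leave a single symlink behind at the old path (ln -s /home/rembol/output /home/rembol/solr/target/deploy), so every existing link still resolves through it. If the links themselves should point at the new location, re-pointing them is a short script; the sketch below assumes the links all sit in one directory (LINK_DIR is a placeholder for wherever yours live) and simply rewrites any link target that starts with the old prefix.

        import os

        LINK_DIR = "/home/rembol/links"                 # placeholder: directory holding the symlinks
        OLD_PREFIX = "/home/rembol/solr/target/deploy"
        NEW_PREFIX = "/home/rembol/output"

        def repoint(link_dir, old_prefix, new_prefix):
            for name in os.listdir(link_dir):
                path = os.path.join(link_dir, name)
                if not os.path.islink(path):
                    continue
                target = os.readlink(path)
                if target.startswith(old_prefix):
                    new_target = new_prefix + target[len(old_prefix):]
                    os.remove(path)                     # drop the stale link...
                    os.symlink(new_target, path)        # ...and recreate it in place

        if __name__ == "__main__":
            # Move the real files first (mv or shutil.move), then fix the links.
            repoint(LINK_DIR, OLD_PREFIX, NEW_PREFIX)

    For links scattered across subdirectories, wrap the same logic in os.walk instead of os.listdir.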

    Read the article

  • dojo.xhrPost and Zend Framwork action, no POST data, not using a form

    - by sims
    Hi all, I'm trying to send some data via dojo.xhrPost to an Zend Controller Action. I can see the data being sent in Firebug console. However, when inspecting the post data, the array is empty. I'm not sure if it is possible to send an arbitrary string of data via dojo.xhrPost without using a form. This is probably a very n00b mistake. In any case, I'll post my code here and see what you all think. In my layout script I have: <?php $sizeurl = $this->baseUrl() . '/account/uisize'; ?> function resizeText(multiplier) { if (document.body.style.fontSize == "") { document.body.style.fontSize = "1.0em"; } document.body.style.fontSize = parseFloat(document.body.style.fontSize) + (multiplier * 0.1) + "em"; var size = document.body.style.fontSize; var xhrArgs = { url: "<?= $sizeurl; ?>", postData: size, handleAs: "text" } dojo.xhrPost(xhrArgs); } Then my action is: public function uisizeAction() { $this->_helper->viewRenderer->setNoRender(); $this->_helper->layout->disableLayout(); print_r($_POST); $request = $this->getRequest(); if ($request->isXmlHttpRequest()) { $postdata = $request->getPost(); print_r($postdata); if ($postdata) { $user = new Application_Model_DbTable_User(); $user->updateSize($postdata); } } } I'm pretty sure that post data from a form is an array with the form elements' names as the keys. When looking at the dojo.xhrPost examples on the dojo campus web site (http://docs.dojocampus.org/dojo/xhrPost second one to be precise), it looks as if I can just send a string of data. How do I access this data from a Zend Controller Action? I'm using ZF 1.10 and Dojo 1.4.2 Thanks for your help! PS I'd try to ask on one of the related questions, but I cannot seem to comment.

    Read the article

  • How can I encrypt CoreData contents on an iPhone

    - by James A. Rosen
    I have some information I'd like to store statically encrypted in an iPhone application. I'm new to iPhone development, so I'm not terribly familiar with Core Data and how it integrates with the views. I have the data as JSON, though I can easily put it into a SQLite3 database or any other backing data format. I'll take whatever is easiest (a) to encrypt and (b) to integrate with the iPhone view layer. The user will need to enter the password to decrypt the data each time the app is launched. The purpose of the encryption is to keep the data from being accessible if the user loses the phone. For speed reasons, I would prefer to encrypt and decrypt the entire file at once rather than encrypting each individual field in each row of the database. Note: this isn't the same idea as Question 929744, in which the purpose is to keep the user from messing with or seeing the data. Here the data should be perfectly transparent when in use. Also note: I'm willing to use SQLCipher to store the data, but would prefer to use things that already exist in the iPhone/Core Data framework rather than go through the lengthy build/integration process SQLCipher involves.

    Read the article

  • MooTools Event Delegation using HTML5 data attributes.

    - by Anurag
    Is it possible to have event delegation using HTML5 data attributes in MooTools? The HTML structure I have is:
    <div id="parent">
      <div>not selectable</div>
      <div data-selectable="true">selectable</div>
      <div>not selectable either.</div>
      <div data-selectable="true">also selectable</div>
    </div>
    I want to set up <div id="parent"> to listen only to clicks on child elements that have the data-selectable attribute. Please let me know if I'm doing something wrong. The event is set up as:
    $("parent").addEvent("click:relay([data-selectable])", function(event, el) {
      alert(this.get('text'));
    });
    but the click callback fires when clicking any of the divs, not just the ones with a data-selectable attribute defined. You can see this example at http://jsfiddle.net/NUGD4/ A workaround is to add this as a CSS class, which works with delegation, but I would prefer to use data attributes since they are used throughout the application.

    Read the article

  • Data sharing amongst JPA Entities

    - by Nick
    Setup: I have a simple web app with a handful of forms, each on a separate page. These forms represent patient data, and there is a one-to-one relationship between a patient and each of these forms/entities. Each form maps directly to a database table and a JPA entity; maybe not the best architecture, but it works and is simple. Question: if form/entity A and form/entity B share a common chunk of data (one or more fields), what is the best way to handle that in JPA? That is, if the data gets inserted via form A, I need it to show up in form B as existing data, and vice versa. In other words, it's logical for both entities to contain that data. I believe I will have to move the common data into its own entity and define the relationships that way, but I have tried many different approaches and none gets me all the way there, at least with basic JPA. Can this be done through pure JPA relationships, or will I have to write a bunch of code to make this happen manually? I'm not looking for code specifically, just the correct way to model this data. Thanks.

    Read the article

  • Mono ASP.NET Oracle Connection

    - by bladepit
    Hello everybody, when I try to connect to Oracle I get the following error:
    libclntsh.so Description: HTTP 500. Error processing request.
    Stack Trace: System.DllNotFoundException: libclntsh.so
    at (wrapper managed-to-native) System.Data.OracleClient.Oci.OciCalls/OciNativeCalls.OCIEnvCreate (intptr&,System.Data.OracleClient.Oci.OciEnvironmentMode,intptr,intptr,intptr,intptr,int,intptr) <0x0005d
    at System.Data.OracleClient.Oci.OciCalls.OCIEnvCreate (intptr&,System.Data.OracleClient.Oci.OciEnvironmentMode,intptr,intptr,intptr,intptr,int,intptr) [0x00000] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient.Oci/OciCalls.cs:738
    at System.Data.OracleClient.Oci.OciEnvironmentHandle..ctor (System.Data.OracleClient.Oci.OciEnvironmentMode) [0x00013] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient.Oci/OciEnvironmentHandle.cs:35
    at System.Data.OracleClient.Oci.OciGlue.CreateConnection (System.Data.OracleClient.OracleConnectionInfo) [0x00000] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient/OciGlue.cs:86
    at System.Data.OracleClient.OracleConnectionPoolManager.CreateConnection (System.Data.OracleClient.OracleConnectionInfo) [0x00006] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient/OracleConnectionPoolManager.cs:57
    at System.Data.OracleClient.OracleConnectionPool.CreateConnection () [0x0000e] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient/OracleConnectionPool.cs:97
    at System.Data.OracleClient.OracleConnectionPool.GetConnection () [0x000ba] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient/OracleConnectionPool.cs:74
    at System.Data.OracleClient.OracleConnection.Open () [0x00061] in /src/monoscript/mono-2.4.2.3/mcs/class/System.Data.OracleClient/System.Data.OracleClient/OracleConnection.cs:410
    at WebServer.Controllers.HomeController.Index () [0x00006] in /home/bhcweb/Projects/Controllers/HomeController.cs:19
    at (wrapper dynamic-method) System.Runtime.CompilerServices.ExecutionScope.lambda_method (System.Runtime.CompilerServices.ExecutionScope,System.Web.Mvc.ControllerBase,object[]) <0x00080
    at System.Web.Mvc.ActionMethodDispatcher.Execute (System.Web.Mvc.ControllerBase,object[]) <0x0001b
    at System.Web.Mvc.ReflectedActionDescriptor.Execute (System.Web.Mvc.ControllerContext,System.Collections.Generic.IDictionary2<string, object>) <0x000fd>
    at System.Web.Mvc.ControllerActionInvoker.InvokeActionMethod (System.Web.Mvc.ControllerContext,System.Web.Mvc.ActionDescriptor,System.Collections.Generic.IDictionary2) <0x0001c
    at System.Web.Mvc.ControllerActionInvoker/c_AnonStoreyB.<m_E () <0x00067
    at System.Web.Mvc.ControllerActionInvoker.InvokeActionMethodFilter (System.Web.Mvc.IActionFilter,System.Web.Mvc.ActionExecutingContext,System.Func`1) <0x000c4
    What is my problem here? I have read that I have to set ORACLE_HOME and LD_LIBRARY_PATH. If I run echo $ORACLE_HOME and echo $LD_LIBRARY_PATH, the path I set is printed: /usr/lib/oracle/xe/app/oracle/product/10.2.0/client/lib. This is the path that contains libclntsh.so. Is this right? Best regards, bladepit

    Read the article
