Search Results

Search found 68762 results on 2751 pages for 'data transfer objects'.

  • Lost all data on Windows XP after blue screen

    - by Barb
    I got a blue screen and was trying to boot from my OS disk. Frankly, I was unsure exactly how to do this; I was trying everything, and at one point booted in partition mode. Finally, I booted with the disk, ran chkdsk /r, and was able to log into Windows. But all of my files and pictures are gone. I have no backup, and I'm sick to think that I lost the last seven years of pictures of my kids. What can I do?
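
    A quick, hedged suggestion before anything drastic: when chkdsk /r repairs a volume, it often sweeps orphaned file fragments into hidden FOUND.000, FOUND.001, ... folders at the drive root as .CHK files, so it is worth checking for those first (drive letter C: assumed):

        C:\> dir C:\ /a:d
        C:\> dir C:\FOUND.000 /a
        C:\> ren C:\FOUND.000\FILE0001.CHK FILE0001.jpg

    Renaming a .CHK file back to its likely extension (e.g. .jpg) is often enough to make it openable again.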

    Read the article

  • Data Protection Manager System Protection Backups Failing

    - by TrueDuality
    I'm just starting to set up DPM 2010 in a test environment with a domain controller and a file server. Everything seems to be working fairly well, and I can get all of my backup jobs to succeed except for the "Computer\System Protection" backups. Both servers are running fully up-to-date 64-bit Windows Server 2008 R2 Enterprise with Service Pack 1. The error being reported is: DPM cannot create a backup because Windows Server Backup (WSB) on the protected computer encountered an error (WSB Event ID: 517, WSB Error Code: 0x8078001D). (ID 30229 Details: Internal error code: 0x809909FB) This Microsoft Knowledge Base article describes the issue perfectly and provides a hotfix. I downloaded the hotfix, moved it onto the affected server, attempted to run it, and received the following error: The update is not applicable to your computer. I've verified that I did indeed download the 64-bit version. According to this thread, the hotfix got rolled into Service Pack 1, yet I'm still experiencing the issue. Both machines do have the Windows Server Backup feature installed. Can anybody point me in the right direction? What am I missing?
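
    To isolate whether the failure lies in WSB itself rather than in DPM, one sanity check is to trigger a system state backup by hand on the protected server; if the manual run fails with the same 0x8078001D, DPM is off the hook (the E: target volume below is only an example):

        C:\> wbadmin start systemstatebackup -backupTarget:E:
        C:\> wbadmin get versions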

    Read the article

  • Slow NFS transfer performance of small files

    - by Arie K
    I'm using Openfiler 2.3 on an HP ML370 G5, Smart Array P400, SAS disks combined using RAID 1+0. I set up an NFS share from an ext3 partition using Openfiler's web-based configuration, and I was able to mount the share from another host. Both hosts are connected by a dedicated gigabit link. Simple benchmark using dd: $ dd if=/dev/zero of=outfile bs=1000 count=2000000 2000000+0 records in 2000000+0 records out 2000000000 bytes (2.0 GB) copied, 34.4737 s, 58.0 MB/s So it can achieve a moderate transfer speed (58.0 MB/s). But if I copy a directory containing many small files (.php and .jpg, around 1-4 kB per file) totalling ~300 MB, the cp process takes about 10 minutes to finish. Is NFS unsuitable for small-file transfers like the case above, or are there some parameters that must be adjusted?
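
    Small-file copies over NFS are dominated by per-file round trips and synchronous metadata commits rather than raw bandwidth, so the dd result says little about this workload. A sketch of the knobs that usually matter, with example paths and subnet (note that async trades crash safety for speed):

        # /etc/exports on the Openfiler box
        /mnt/vg0/share  192.168.0.0/24(rw,async,no_subtree_check)

        # client mount: larger transfer sizes, no access-time updates
        $ mount -t nfs -o rw,noatime,rsize=32768,wsize=32768 filer:/mnt/vg0/share /mnt/nfs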

    Read the article

  • Server hang - data loss on reboot, post mortem analysis

    - by rovangju
    A development server I'm responsible for (ext3 on RAID 5 with Debian Squeeze) froze up over the weekend and I was forced to reset it: unresponsive to KVM/physical keyboard access, no eth devices responding, etc. Not even the backup process ran (figures, the one time I don't check for confirmation). After the reset, it turns out that every trace of disk I/O activity that should have happened over a period of ~24 hours is completely gone. The log files have a big gap in their dates and times, as if the writes were never committed to disk; no processes seem to have run. Luckily it was a weekend, nothing of value was lost, and I don't suspect a hack. What can I do in the post mortem of this event to prevent it from ever happening again? I've seen this happen before on a completely different machine running FreeBSD. I am rounding up the disk-checking tools right now, but there must be more going on! Mount options: /dev/sda1 on / type ext3 (rw,errors=remount-ro) Kernel: Linux dev 2.6.32-5-686-bigmem Disk/Inodes: 13%/3%
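
    One hedged guess worth ruling out: on a 2.6.32 kernel, ext3 mounts with write barriers off by default, so a long stretch of writes can sit in the controller or drive cache and vanish on a hard reset if that cache has no battery backup. A sketch of the checks, assuming the layout above (hdparm may not reach disks hidden behind a hardware RAID controller):

        # /etc/fstab: request write barriers explicitly
        /dev/sda1  /  ext3  rw,errors=remount-ro,barrier=1  0  1

        # query the drive-level write cache setting
        $ hdparm -W /dev/sda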

    Read the article

  • Backup data rate on Raspberry Pi maxing out at 5 Mb/s. Why?

    - by bastibe
    I set up my Raspberry Pi as a Time Machine, as documented here. At the moment, the Raspberry Pi is connected to my MacBook Pro with a direct Ethernet cable, and an external hard drive (a laptop drive) is connected to the Raspberry Pi over USB. However, backups are pretty slow: Activity Monitor shows the network transferring a very steady 5 Mb/s, where my Time Capsule manages up to 8 Mb/s with a lot of fluctuation. The Raspberry Pi self-reports (top) that its CPU is only half-used, split roughly equally between afpd, usb-storage and jbd2/sda1-8, so the Pi's processing power does not seem to be the problem here. To me, this looks like some kind of bottleneck that maxes out at 5 Mb/s, potentially making my backups run at less than their achievable speed. To the best of my knowledge, the culprit might be the AFP daemon, the USB bus, or the external hard drive itself. So, my question is: how can I identify the true culprit, and what can I do about it?
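
    To find the bottleneck, it helps to measure each leg on its own rather than through the whole AFP stack. A sketch, assuming the USB disk is mounted at /mnt/backup on the Pi and iperf is installed on both machines:

        # raw sequential write to the USB disk, flushed to media
        pi$ dd if=/dev/zero of=/mnt/backup/testfile bs=1M count=256 conv=fsync

        # raw TCP throughput over the direct Ethernet cable
        pi$ iperf -s
        mac$ iperf -c <address-of-the-pi>

    Whichever leg comes in near the backup rate is the likely culprit.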

    Read the article

  • Corrupted .WAR file after transfer between 32-bit and 64-bit Windows Server and desktop machines

    - by Albert Widjaja
    Hi all, has anyone experienced this problem of a corrupted .WAR file after it has been copied over a network share? A .WAR file (Web Archive) is the J2EE application archive (compressed with the same ZIP algorithm, I think). Scenario 1: Windows Server 2008 x64 transfer to Windows XP using the RDP client (Local Devices and Resources). Scenario 2: Windows XP 32-bit transfer to Windows Server 2003 x64 using a shared network drive (port 445, SMB?). In both scenarios the transfer always produces a corrupted file (the source code appears duplicated at the ends of lines when you open it in the Eclipse/Java IDE), but when I compress the file into a .ZIP first, everything is OK in both scenarios. Can anyone explain why this happens? Thanks, Albert
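
    A useful first step is to pin down which hop changes the bytes: hash the archive on both ends of each copy and compare (the file name is an example; certutil -hashfile needs Vista/2008 or later, and Microsoft's downloadable fciv tool covers XP/2003):

        C:\> certutil -hashfile app.war MD5

        $ md5sum app.war

    If the hashes differ after a copy, that transport is mangling the file. The "duplicated lines" symptom is typical of a CR/LF text-mode translation applied to binary data, which would also fit a transfer layer that treats .war and .zip extensions differently.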

    Read the article

  • Wifi connected but no data transfer

    - by Anuj
    I have a desktop which runs Windows XP and a laptop which runs Ubuntu. Recently I set up a wireless router in order to access the internet on my laptop over wifi. The laptop connects to the wifi with ease but is unable to transfer any data. Only when I first switch on the laptop is it able to transfer some data, and only for around 2 minutes, after which pinging the router shows Destination Host Unreachable and everything stops working, though the wifi still shows as connected. Please help!
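
    When the association stays up but traffic dies, the usual suspects are ARP, the DHCP lease, or power management on the wireless NIC. A few diagnostics worth running on the laptop the moment it happens (the gateway address and interface name are examples):

        $ ping -c 4 192.168.1.1      # can the router be reached at all?
        $ arp -n                     # 'incomplete' next to the gateway means ARP is failing
        $ iwconfig wlan0             # association state, link quality, power management
        $ sudo dhclient -v wlan0     # force a fresh DHCP lease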

    Read the article

  • Recover data from physically damaged harddrive. What are my options?

    - by Michael Kniskern
    I was trying to replace the power supply in my desktop PC and ended up physically damaging the data connection from the hard drive to the motherboard: the plastic shelf for the copper prongs on the hard drive broke into the cable. Here is a picture of my handiwork: I went to the Best Buy Geek Squad to discuss my options, and they said they would need to send it to their recovery center, and that it could cost anywhere between $250 and $1,600 USD to recover the data from the hard drive. Is this reasonable for data recovery from a physically damaged hard drive? Are there any other options I can explore? I am going to talk to Data Doctors to see what my options are. Update: I took the HD to Data Doctors, and they told me that the SATA connector was broken, so they would need to replace it and then copy the data to a brand-new hard drive. So, with the initial analysis, the cost of replacement parts, and the data-recovery fee, it came out to $865.00 USD. The technician specifically stated that if this were an older hard drive, they would just need to replace the data connector; but because information specific to the individual drive is stored in its flash ROM, they need to transfer the data to a brand-new hard drive.

    Read the article

  • Recover data from a corrupted virtualbox vmdk file?

    - by Neth
    The power went out while I was doing a build in a VirtualBox machine; when I restarted, the vmdk for the disk the VM was using was corrupted, apparently irrecoverably. I have been able to grep the 66 GB vmdk file, and it finds strings from the code I was working on that hadn't yet made it into Subversion (yeah, yeah, I know). But the strings are either in the shell history or in what look to be strings inside object files. Any ideas for finding/recovering the source code? If it helps, the VM was Linux, Fedora Core 10, on an ext3 filesystem. The host is Ubuntu 10.04 amd64 with an ext4 filesystem.
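
    Since grep already proves the text survives in the image, one low-tech approach is to carve out the region around each hit for inspection: grep -b reports byte offsets, and dd can extract a window around one (the pattern and offset below are placeholders):

        $ grep -a -b -o 'some_function_name' disk.vmdk
        $ dd if=disk.vmdk of=chunk.txt bs=1 skip=123456000 count=1048576

    bs=1 makes dd slow but keeps the offset arithmetic trivial; a 1 MiB window per hit is usually enough to tell shell history from real source files.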

    Read the article

  • Seagate GoFlex NAS + Horrible Speed = Bad Experience

    - by Jon H
    I am having issues with transfer speeds from my desktop PC to my NAS. The NAS is connected to a gigabit gateway, as is my desktop, over Cat 5e. I see transfer rates of up to 4.0 MB/s, with about 2.5 MB/s being typical. There are three partitions on the NAS: Public, Private and Backup. When I transfer from Private to Public I see the speeds above; within the same partition it is almost instant. I was wondering whether the speeds I am seeing are down to my computer or to the NAS; I have been looking into building my own media server because of them. Is there anything I can do in the meantime to speed this up?
    Motherboard: M3970AM-HP (Angelica)
    Processor: AMD FX 6100
    RAM: 10 GB PC3-10600
    Hard drive 1: 1.5 TB SATA 3.0 Gb/s, 5400 RPM
    Hard drive 2: 120 GB SATA SSD
    NAS: Seagate 3 TB GoFlex Home
    Connection 1: 1000BASE-T
    Connection 2: Wireless N

    Read the article

  • Merely chainloading an Acer Recovery Partition deleted all data

    - by WindowsEscapist
    I was making a backup of Acer's factory restore partition, which sits inside an extended partition, and chainloaded it to determine whether or not it still worked. I clicked "take no action" once I saw that it had, in fact, successfully started up. However, when I rebooted, I got "error: no such partition" and was dropped to a GRUB recovery prompt. Upon further investigation, I discovered that all partitions inside the extended partition were gone except for the recovery partition! What happened? How can I fix this? testdisk doesn't find the deleted partitions!
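
    testdisk's default quick scan can miss partitions whose metadata was overwritten; a logged, deeper pass sometimes still finds the lost boot sectors (the menu names below are from the testdisk UI):

        $ sudo testdisk /log /dev/sda
        # in the UI: select the disk -> Intel partition table -> Analyse -> Quick Search,
        # then Deeper Search if the quick pass comes back empty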

    Read the article

  • Ways to recover data from external hard drive

    - by Howard Benson
    I use an external hard disk for Time Machine backups of my Mac (OS X 10.5.8). I did something wrong, and I have found important folders in the Trash. These folders come from the external HD; they are backup folders (Backups.backupdb) and others. I have tried to restore them by dragging and dropping. Some of them went back onto the external HD after a while; for the others, it takes hours at "preparing to copy" and then it says there is no space to copy to on the external HD. It's strange: the files are now in the Trash (180 GB), and the external HD should have plenty of free space, but it doesn't. The space on the external HD is not freed even though these files are in the Trash. I'm also unable to use Time Machine now (I have "lost" the old backups) for the same reason: the external HD says it has no free space. I'm asking for advice. Thanks
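
    A possible explanation, offered as a guess: on a Mac, items trashed from an external volume are kept in a hidden .Trashes folder on that same volume, so the space is not released until the Trash is emptied. This is one way to confirm and reclaim it (the volume name is an example, and the second command permanently deletes the trashed backups):

        $ du -sh /Volumes/BackupHD/.Trashes
        $ sudo rm -rf /Volumes/BackupHD/.Trashes/*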

    Read the article

  • Proxy software that supports parallel transfer

    - by est
    Hi guys, I need to set up a really fast proxy on a remote server. Here's the scenario: The server prefetches 3 KB of data, mostly HTTP resources. Instead of acting as a traditional HTTP or SOCKS proxy, the server sends the 3 KB to the client over a multithreaded transfer with 3 connections, 1 KB per connection. The client receives the three 1 KB pieces, recombines them into the original 3 KB, and serves the result through a local HTTP proxy. The client then displays the original data in the browser via that local HTTP proxy. Latency is not important as long as the transfer rate is good. Does any software like this exist? Open source or free would be better.

    Read the article

  • Data recovery from corrupt Ubuntu partition/directory (question about a previous answer)

    - by JoshMaurice
    I have an Ubuntu installation that won't boot anymore. I asked a previous question about it here: http://superuser.com/questions/15916/ubuntu-chkdsk-equivalent Bolotov replied: As I see from your previous question, you can boot Windows, so you could use dskprobe from the Windows XP Service Pack 2 Support Tools to make sure that the fs type is correct ... but it's already correct: fs type 7 is NTFS. The message "The type of the file system is RAW. CHKDSK is not available for RAW drives." means that Windows can't determine the fs type for some reason. As we see, the fs type is correct. To run chkdsk on your Windows partition you can install the Windows Recovery Console, boot into it, and check your disk. After checking the disk you will regain access to your c:\ubuntu\disks. I think you can then mount your Linux partition (which is in a file) as a usual loopback device: mount -o loop [path to your linux loopback partition] But you should mount the Windows partition first. So now I'd like to know: within the Recovery Console I will be issuing the commands "chkdsk /r", then "mount -o loop [path to windows partition]", and then "mount -o loop c:\ubuntu\disks" - correct? I do have a ("corrupt and unreadable") c:\ubuntu\disks directory, so that appears to be the correct path to the Linux partition; do you know the path to the Windows partition? Would that be just "c:\"?
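
    For reference, a c:\ubuntu\disks path indicates a Wubi-style install, where the Linux root filesystem lives in a loopback image file on the NTFS volume. The mount commands are Linux commands, not Recovery Console ones: once chkdsk /r has repaired the Windows partition, the image can be mounted from an Ubuntu live CD roughly like this (root.disk is the usual Wubi image name, assumed here):

        $ sudo mkdir -p /mnt/win /mnt/wubi
        $ sudo mount -t ntfs-3g /dev/sda1 /mnt/win
        $ sudo mount -o loop /mnt/win/ubuntu/disks/root.disk /mnt/wubi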

    Read the article

  • Remote file copy util (like rsync) that will take account of data already copied (in this session)

    - by Rory McCann
    Let's say I have a directory with 2 files, both identical and quite large (e.g. 2 GB each), and I want to rsync that directory to a remote host. As I understand it (and I could be wrong), rsync calculates checksums of files. Surely if it sees 2 files with the same checksum, it could copy the first file over the wire and then do a local copy on the remote host for the 2nd file? That would make it faster, no? On a similar note, doesn't rsync hash all the remote files before copying? If it saw a different file with the same hash as a file that was to be transferred, it could do a local copy on the remote host. Does rsync support this sort of thing? Is there some way to turn it on? Is there a tool similar to rsync that will do this sort of hash-based local copying?
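
    rsync does not deduplicate across different files by itself, but it does preserve hard links when asked, which gives a workable trick here: hard-link the identical files locally first, and the content crosses the wire only once (file names are examples):

        $ ln -f big-a.img big-b.img          # big-b.img now shares big-a.img's inode
        $ rsync -aH ./dir/ user@remote:/dir/ # -H recreates the hard link remotely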

    Read the article

  • AFP/SMB transfers cap at 2 megabytes/sec over wireless N

    - by RD.
    I wanted to transfer files between two Mac computers. The network is wireless N, and both computers have wireless N modules in them. The problem is that when I transfer files between them via file sharing (AFP), the network speed caps at 2 megabytes/sec. Just downloading files from the internet I get faster speeds, so this isn't a constriction of my wifi bandwidth; it appears to be a constriction of the protocol being used. My wifi is set to 130 Mbit/s, so I should see real-world transfer speeds of around 12-16 megabytes/sec. I ran this command on both computers: sudo sysctl -w net.inet.tcp.delayed_ack=0 which is supposed to lower TCP overhead, but it did not help. How can I get the speed I am expecting?
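
    To separate protocol overhead from radio bandwidth, a raw TCP stream between the two Macs with no file-sharing protocol in the path gives a baseline (the host name is an example; BSD dd on OS X accepts the lowercase 1m):

        mac1$ nc -l 2222 > /dev/null
        mac2$ dd if=/dev/zero bs=1m count=100 | nc mac1.local 2222

    dd reports the elapsed time and rate when it finishes; if the raw stream also tops out near 2 MB/s, the limit is the wifi link rather than AFP.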

    Read the article

  • Recover deleted data

    - by atapimp24
    Hi, a user somehow deleted his documents from his laptop and has no backup available. How would one go about recovering these deleted files? I have zero experience with this issue. Are there any open-source or freeware tools that I can use to attempt a recovery? Thanks
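
    Two free options that fit: Recuva (freeware, Windows) for a quick undelete while the directory entries still exist, and PhotoRec from the open-source TestDisk package for signature-based carving once they are gone. The one firm rule is to recover onto a different drive than the one being scanned. PhotoRec is interactive and starts like this:

        $ sudo photorec /log /dev/sda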

    Read the article

  • Rescue data from damaged hard disk

    - by Lexsys
    Hello. I have a 500 GB hard drive with one NTFS partition on it. I can mount it with Ubuntu and view the contents, but when I try to copy something, I get an I/O error. OK, I tried to make an image of it with dd: I/O error as soon as it starts. I have installed ddrescue, but its manual page says not to use it with drives failing on I/O. Can I still get some data off this drive, and how do I do it?
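
    A note that may clear up the confusion: the warning about failing drives applies to plain dd (and the older dd_rescue); GNU ddrescue exists precisely for this job, since it skips unreadable areas on a first pass and records them in a log so later passes retry only the bad spots. A typical two-pass run (sdb and the file names are examples; always image onto a different, healthy disk):

        $ sudo ddrescue -n /dev/sdb ntfs.img rescue.log      # fast pass, skip bad areas
        $ sudo ddrescue -d -r3 /dev/sdb ntfs.img rescue.log  # retry the bad areas 3 times

    The NTFS partition can then be mounted or repaired from the image instead of the dying drive.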

    Read the article

  • MySQL transfer / update (a bit specific)

    - by Jeff
    Before posting I dug through the whole site but didn't find help for my problem, so I hope someone can help... Facts: a 30 GB MySQL database on a remote server (about 20,000,000 rows); the data is updated once weekly on the local network (MySQL); I need to transfer/replace the remote database with the locally updated one; the connection is about 2 MB/s (real megabytes, not Mbps) up/down. The point is that I can't have any downtime on the remote MySQL server. Until now I have tried: Navicat data sync - OK, but takes about 3 days to finish; dbForge - OK, but needs 5 days to finish; a mysqldump transferred to the remote server and executed there - about a day, but a lot of downtime; rsyncing the database folder /mysql/lib/MY_DATABASE - 4 hours, but afterwards I always have to run a repair on the remote server, which takes about 2 hours and means a lot of downtime; a mysqldump piped from the command line directly to the remote server - still not satisfied, many problems; MySQL replication - slow. I could list more things that I tried... Anyway, what is the best way to refresh the remote MySQL database weekly with zero downtime and no huge server load? If you have any ideas, please share.
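
    One pattern that gets close to zero downtime under these constraints: load the weekly dump into a staging schema on the remote server (slow, but invisible to clients), then swap it into place with RENAME TABLE, which is atomic in MySQL. A sketch with placeholder names (mydb_staging and mydb_trash must already exist):

        # push the compressed dump straight into the staging schema over ssh
        $ mysqldump --single-transaction mydb | gzip | ssh remote 'gunzip | mysql mydb_staging'

        # on the remote server, per table, once the load finishes (atomic swap)
        $ mysql -e "RENAME TABLE mydb.t1 TO mydb_trash.t1, mydb_staging.t1 TO mydb.t1;"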

    Read the article

  • Virtual SD card won't transfer [migrated]

    - by Hyztname
    I have a Verizon Galaxy Nexus 32 GB. I've installed AOKP on it, and it had been running pretty well with no bugs etc., but after I connected it to my Nintendo Wii, I think it remounted the sdcard as FAT32 by itself. I'm receiving the following error when trying to download something with the 4Shared sync app (though actually nothing can download or transfer anything to the virtual sdcard): XXXXX.apk: open failed: EACCES (Permission denied) I already restored my old backup, wiped all data, etc.; nothing seems to work. Note that I can still access my SD card; I just can't transfer files, take pics... basically write any data. PS: I mention the 4sync problem because it was the only app that reported anything specific.
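
    To see what state the storage is actually in, adb from a computer can show how the volume is mounted and whether writes fail at the filesystem level (the paths are the stock ones):

        $ adb shell mount | grep -i sdcard             # look for 'ro' or unexpected fuse/vfat flags
        $ adb shell "echo probe > /sdcard/writetest"   # direct write probe; EACCES here confirms it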

    Read the article

  • SQL SERVER – Configure Management Data Collection in Quick Steps – T-SQL Tuesday #005

    - by pinaldave
    This article was written as a response to T-SQL Tuesday #005 - Reporting. The three most important components of any computer or server are the CPU, memory, and hard disk. This post shows how to get more details about these three components using Management Data Collection, which generates reports for all three by default. Configuring Data Collection is a very easy task and can be done very quickly. Please note: there are many different ways to get reports on CPU, memory and I/O; you can use DMVs, Extended Events, as well as Perfmon to trace the data. Keeping with the T-SQL Tuesday subject of reporting, this post is a visual tutorial for quickly configuring Data Collection and generating the reports. From Books Online: The data collector is a core component of the Data Collection platform for SQL Server 2008 and the tools that are provided by SQL Server. The data collector provides one central point for data collection across your database servers and applications. This collection point can obtain data from a variety of sources and is not limited to performance data, unlike SQL Trace. Let us go over the visual tutorial on how quickly Data Collection can be configured: expand the Management node under the main server node and follow the directions in the pictures. These reports can be exported to PDF as well as Excel by right-clicking on them. Now let us look at some additional screenshots of the reports. The reports are largely self-explanatory but can be drilled into for further details. Click on an image to enlarge it. Well, as we can see, it is very easy to configure and use this tool. Do you use it in your organization? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL Reporting, SQL Reports
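
    For completeness, the same system collection sets can be driven from T-SQL instead of the SSMS wizard; a minimal sketch via sqlcmd, where collection set id 1 is assumed to be the "Disk Usage" system set of a default install:

        $ sqlcmd -S myserver -E -Q "EXEC msdb.dbo.sp_syscollector_start_collection_set @collection_set_id = 1;"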

    Read the article

  • Allen for Umbraco with location EXIF meta data

    - by Vizioz Limited
    The latest version of Allen for Umbraco has now hit the Apple App Store. We have managed to add some nice improvements to this version, including: storing location and direction information when photos are taken within the app; embedding EXIF data into the images when they are uploaded; background uploading; and pull-to-refresh for the media tree.

    Location and direction: By default, when the camera is used within an application, the location and the direction the camera is pointing are not stored in the image metadata. We have now added full support so that this data is included. There is also a setting that lets you prevent this data from being uploaded to your website; if you do not want the location data to be sent, you can turn it off within Allen. Note: please don't forget that location services do need to be turned on to allow the app to access the images in the phone's asset library. We have already had quite a few ideas from users for this location data, from logging free parking in Denmark to geo-tagging holiday photos and linking the photos to Google Street View.

    Embedding EXIF data: We now embed all the metadata available on the iPhone into the image when it is uploaded to your server, which allows you to pull the data out and use it within your site. Have a look at Cultiv's Photo Meta Data package for great example code that automatically pulls this data out and populates properties on your Umbraco media item. We slightly modified the source code of this package so that it always extracts the image data, as the default package requires a property before the data is extracted; it's an easy change, and if you get stuck, add a comment to this post.

    Background uploading: If you need to start doing something else on your phone while uploading multiple images, you can now press the home button and the application will continue to upload your images in the background. As soon as it has finished, you will receive a standard Apple notification.

    Pull to refresh: Our final enhancement has been to add pull-to-refresh to the media trees; just pull the tree downwards with your finger and it will refresh. This is useful if you are adding items to your media tree while testing your site with Allen for Umbraco.

    Future enhancements... your ideas? If you have any ideas for future enhancements, feel free to add a comment below!

    Read the article

  • Which data structure should I use for dynamically generated platforms?

    - by Joey Green
    I'm creating a platformer with various types of platforms: platforms that move, shake, rotate, etc. Multiple types, and multiple platforms of each type, can be on screen at once, and the platforms are procedurally generated. I'm trying to figure out which of the following would be the better platform system: (1) pre-allocate all platforms when the scene loads, storing each platform type in its own array (e.g. regPlatformArray), and just grab one when I need it; or (2) allocate and load what I need at the moment my code needs it. The problem with 1 is keeping track of which indices are in use on screen and which aren't. The problem with 2 is that I'm having a hard time wrapping my head around how I would store these platforms so that I can call the update/draw methods on them, while managing a data structure that is constantly growing and shrinking. It seems there could be too much complexity. I'm using the cocos2d iPhone game engine. Anyway, which option would be best, or is there a better option?

    Read the article
