Search Results

Search found 59060 results on 2363 pages for 'dummy data'.

  • Need decent undelete utility for Windows

    - by Michael Chermside
    So I get the phone call from a friend, relative, or client: "Looks like I accidentally deleted 4 years worth of accounting data. Oh, and I don't have any backups. Can you help?" What I need is a good Windows tool (for NTFS, I think) that scans blocks marked as deleted and attempts to recover the files. It needs to run from a thumb drive, since installing new software has a decent chance of overwriting the data before it can be recovered. Googling for such tools turns up mostly carefully-crafted advertisements for moderately expensive products with a "Free! Demo! Version!", which is not exactly reassuring. Any recommendations for a good tool?

  • Forgot to unmount/eject external hard drive, lost moved files. Mac OS X

    - by balupton
    So I was using my Mac with my external hard drive connected via USB. I moved about 10 GB of data to it (via drag and drop while holding down the Command key to move the files rather than to copy them). They moved to the drive all right, but as I was having some issues and the Finder crashed after the transfer, I was unable to eject the volume and later everything froze so I had to do a hard restart (hold the power button). When I remounted the volume (plugged the external hard drive back in) it no longer had any of the files which I moved onto it. As it was a lot of data, how can I recover these files?

  • Creating svn repos programmatically from a webpage with sudo

    - by Adriano Varoli Piazza
    We want to automate the creation of the svn repos and Trac environments for new projects. Basically, this means writing a web script that gets some info (like environment and repo name) from the user and then executes:

        sudo -u svn svnadmin create /var/svn/<projectname>
        trac-admin /var/trac/sites/<projectname> initenv [... all extra params ...]

    The second command is simple, as the script already runs as the www-data user, so no sudo is needed. But for the first command I'd have to use sudo and add www-data to the sudoers file. I was wondering if this is a good idea, and how to do it safely in that case; reading the manpage has left me with more doubts than certainties. This webserver is only accessible from our internal network, by the way. The OS is Ubuntu Server 10.04.
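
    A minimal sketch of the sudoers route, assuming the web script runs as www-data (the file name and the svnadmin path are illustrative; check the real path with "which svnadmin"):

        # /etc/sudoers.d/svn-create -- edit with visudo, never directly.
        # Let www-data run exactly one command, as the svn user, without a password:
        www-data ALL=(svn) NOPASSWD: /usr/bin/svnadmin create /var/svn/*

        # The web script would then call:
        sudo -u svn /usr/bin/svnadmin create /var/svn/"$PROJECT"

    Note that the wildcard matches anything after the prefix, including strings with spaces and extra arguments, so the script must validate the project name against a strict pattern (say, only letters, digits, and hyphens) before the sudo call; that validation, more than the sudoers syntax, is what makes or breaks this setup.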

  • Building a Data Warehouse

    - by Paul
    I've seen tutorials, articles, and posts on how to build data warehouses with star and snowflake schemas, denormalization of OLTP databases, fact and dimension tables, and so on. I've also seen comments like: "Star schemas are for data marts, at best. There is absolutely no way a true enterprise data warehouse could be represented in a star schema, or snowflake either." I want to create a database that will serve Reporting Services and maybe (if that isn't enough) install Analysis Services and extract reports and data from cubes. My question is: is it really necessary to redesign my current database and follow the star/snowflake schemas with fact and dimension tables? Thank you

  • HP Rapid Deployment - Change Data Store Path

    - by David Carreyette
    I am running HP Rapid Deployment (Altiris eXpress Deployment Server 6.9, build 164) on an inherited Windows Server 2003 SP2 machine. I need to change the data store path: the default points to the C: drive, which does not have enough space, and I would like to move it to the D: drive, where there is plenty. The documentation says: "Data store path: Specifies the path to stored packages and files and other DS functions (such as license verification). The default path is C:\Program files\Altiris\express\Deployment Server. Note: Do not use this setting to change the path to the Deployment Share. Modifying this setting does not automatically allow you to use another shared directory other than the express share. To change the Deployment Share shared directory, run a Custom install to establish another location for the Deployment Share." Is there any other way to change the path, given that I do not have the install media?

  • How does data storage work? [closed]

    - by Andres Adhi
    I am really new to the whole concept of data storage, domains, servers, and everything else related to this. Can someone please explain what a domain is? How are servers part of a domain, and how are databases stored on a server or domain? How would a new server connect to an existing database server to get all the data it needs? I have tried to find this information on the web, but I am not finding a good resource, probably because this is really basic information. I would really appreciate it if someone could explain these concepts in plain terms. Thanks in advance.

  • How to recover data from software RAID 5 disk partition

    - by Ali n
    I have CentOS 5.8 on my computer, with 5x 1TB hard drives in a software RAID setup (RAID 1 as the boot partition, md0; RAID 0 as the root partition, md1; and RAID 5 as the /home partition, md3). One of the hard drives failed recently and I want to replace it with a new one. Is it possible to swap the drive without data loss? The important partition is the RAID 5 one: in theory, if one drive fails I should be able to recover its data without any problem. But how do I actually do that in practice?
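
    In practice the replacement is done with mdadm; a sketch, assuming the failed disk is /dev/sde and its RAID 5 member partition is /dev/sde1 (substitute the real device names from /proc/mdstat):

        # Mark the member as failed and remove it from the array
        mdadm --manage /dev/md3 --fail /dev/sde1
        mdadm --manage /dev/md3 --remove /dev/sde1

        # Power down, swap in the new disk, then copy the partition
        # layout from a healthy member and re-add the new partition
        sfdisk -d /dev/sda | sfdisk /dev/sde
        mdadm --manage /dev/md3 --add /dev/sde1

        # The array rebuilds in the background; watch progress here
        cat /proc/mdstat

    The RAID 5 /home partition stays usable (degraded) throughout. Keep in mind, though, that if the failed disk also carried a member of the RAID 0 root (md1), that array has no redundancy at all and its contents would need to be restored from backup.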

  • Backing up data in an encrypted way

    - by Eli Bendersky
    I have the following use case:

    1. There's some data on my PC that I want to periodically back up online.
    2. I own some hosting, so I want to use that for the backups rather than pay for a separate backup service.
    3. I want to encrypt my data locally prior to moving it to the server.

    I have no problem writing scripts to automate the process (say, periodically generate the backup and upload it by FTP to my server), but my main question is about step 3, the encryption: what is the recommended way to encrypt my files (say, collected into a .zip) prior to uploading them to the server? P.S. TrueCrypt seems popular, but it's not quite what I'm looking for, since I don't want the files to be constantly encrypted here on my PC.
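
    For step 3, one common choice is GnuPG in symmetric mode, so the backup only needs a passphrase rather than key management. A minimal sketch; the paths, hostname, and passphrase file are placeholders:

        #!/bin/bash
        # Archive and encrypt locally; only ciphertext ever leaves the machine.
        STAMP=$(date +%Y%m%d)
        tar czf - "$HOME/data" | gpg --symmetric --cipher-algo AES256 --batch \
            --passphrase-file "$HOME/.backup-pass" --output "backup-$STAMP.tgz.gpg"

        # Upload the encrypted archive to the hosting account over FTP
        curl -T "backup-$STAMP.tgz.gpg" "ftp://user:password@example.com/backups/"

    Restoring is the reverse: gpg --decrypt the archive and pipe it into tar. Unlike TrueCrypt, nothing on the PC stays encrypted; only the uploaded copy is.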

  • How do I back up my Windows partition from an Ubuntu live CD?

    - by lalli
    My Windows partition (C:) is corrupt. I'm booting from an Ubuntu live CD and trying to copy all the files from C: to my external drive, but the copy follows and expands all of the links, producing a projected copy size of 1.8TB (my external drive is only 1TB, and the data on C: is around 700MB). I then looked at dd and other backup utilities, but for everything I looked into I couldn't figure out whether the resulting image would be readable in Windows or any other app. Has anyone else backed up data from a corrupted Windows installation using Ubuntu?
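
    The 1.8TB projection suggests the copy is descending into NTFS junction points instead of skipping them. A sketch of two approaches, assuming the Windows partition is mounted at /media/windows and the external drive at /media/external (adjust to the real mount points and device name):

        # File-level copy: -a copies links as links instead of following them,
        # and -x refuses to cross into other mounted filesystems
        sudo rsync -avx /media/windows/ /media/external/windows-backup/

        # Or image the partition itself (only used blocks are saved);
        # requires the ntfsprogs package and the correct device name
        sudo ntfsclone --save-image --output /media/external/windows.img /dev/sda1

    The ntfsclone image is only directly usable from Linux (it can be restored or examined there), so if the goal is files that open anywhere, the rsync copy is the more convenient of the two.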

  • Excel 2007: Exporting more than 100 columns to a .prn file but data is concatenated

    - by Don1
    I want to export an Excel worksheet to a space-delimited (.prn) file. The worksheet is pretty big (187 columns), and when I set the column widths and export it to a .prn file, the data gets cut at the 98th column (about 200 characters wide for my data) and the rest is placed directly underneath, as if I had ripped a page in half from top to bottom and put the right-hand side directly under the left-hand side. How can I get it to export all the columns in full width, without the overflow being wrapped underneath?

  • Fast, reliable data transfers from/to China

    - by Nils
    We are a small company and in the near future we will need to transfer rather large amounts of data (10GB+ each time) between Europe and China. As many have experienced, Internet connections to or from China can be unreliable and slow at times without any apparent reason; for example, while sending data to China via FTP generally works well, it can be painfully slow in the other direction. We are currently investigating ways to get high transfer rates in both directions. So far we have tried:

    - FTP (see above)
    - FTP over VPN services (generally slower than direct connections)
    - F2F (like Retroshare or Freenet: slow!)
    - Aspera (fast but expensive!)
    - BitTorrent (unreachable end nodes, because of firewalls which we must not configure)

    We would like to try:

    - Cloud storage (e.g. Amazon S3, Google Storage): are those services always and reliably reachable from inside China?
    - Point-to-point VPN (currently not possible because of the network; see above)

    I'd be especially grateful to hear from people who have already dealt with this kind of problem.
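
    If both endpoints can run rsync over SSH, it tends to handle flaky long-haul links better than FTP, because an interrupted transfer resumes where it stopped instead of starting over. A sketch, with the host and paths as placeholders:

        # --partial keeps half-transferred files so each retry resumes them
        until rsync -avz --partial --timeout=60 \
              /data/outgoing/ user@host.example.cn:/incoming/; do
            echo "transfer interrupted, retrying in 30s..." >&2
            sleep 30
        done

    Splitting each 10GB batch into chunks first (e.g. with split -b 500M) also caps how much any single stall can cost.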

  • Manually "draw" data for chart, output to CSV

    - by Ambidex
    I need a tool that will let me draw a chart line by hand and generate data points from what I drew. This might sound crazy, but I need some data (preferably as CSV output) that only has to show approximately value X at time Y, and I do not want to produce these values by hand; I only care about how the line flows. So, ideally, I would draw a line on a graph and then get the (X, Y) values along that line out as a CSV. Does anyone know how to accomplish this?

  • Windows Server 2008 Software Raid 5 - Data integrity issues

    - by Fopedush
    I've got a server running Windows Server 2008 R2, with a (Windows native) software RAID 5 array consisting of 7x 1TB Western Digital RE3 and RE4 drives. I have offline backups of this array.

    The problem is this: I noticed a few days ago, after copying a large file to the disk, that there was an integrity issue with that file. It was a ~12GB file that I had downloaded via uTorrent; after moving it to the RAID array, I pointed uTorrent at the new location and performed a re-check so I could seed it from there. The re-check found that only 6308/6310 chunks of the copied file were intact.

    My next step was to write a quick PowerShell script that would copy files to the array while taking a SHA1 hash of the original and resultant files and comparing them. Smaller files (100-1000MB) copied over just fine. When I started copying larger data (~15GB), I found that the hash check failed about two-thirds of the time. The corrupt files had very, very small inconsistencies: less than .01%. I eliminated the possibility of networking or client issues by placing this large file on the C: drive of the server and copying it repeatedly from there to the array, with similar results. Copying the data via Explorer, PowerShell, or the standard Windows command prompt yields the same results. None of the copies fail or report any problems, and the RAID array itself is listed as healthy in Disk Management.

    After a few experiments, I shut down the server and ran memtest overnight. No errors were detected. A basic run of chkdsk found no problems, but I did not use the /R flag, as I was unsure how that might affect a software RAID 5 volume. I next ran CrystalDiskInfo to check the SMART data on the drives, but it only detected 5 out of 7 of the disks in the array; I have no idea why. Nevertheless, it shows the following "caution" flags on a single one of the drives:

        05 199 199 140 000000000001 Reallocated Sectors Count
        C5 200 200 __0 000000000001 Current Pending Sector Count

    This is a little bit alarming, but I don't really know what to do with the information; I hardly feel like one reallocated sector could be causing this.

    At this point, I'm looking for guidance on what to do next. I need to determine the cause of this issue, but I'm hesitant to run chkdsk /R or any bootable disk health checkers because I'm afraid they might break the array. I've considered triggering a re-sync of the array, but I'm not actually sure how to do that without doing something silly like manually dropping a disk and then restoring it. Any advice that could help me ferret out the precise cause of this issue would be greatly appreciated.

  • Making a hidden TrueCrypt volume with existing data

    - by Bill Grey
    I have a 1TB HDD which I would like to encrypt with a hidden TrueCrypt volume: an outer volume containing almost nothing but some decoy data, with the rest of the space taken by the hidden volume. However, my drive is over 95% full. Is it still possible to do this, or would it have to be done on an empty drive, with the data copied over afterwards? I could not find the answer to this question in the documentation. Also, how easy would it be to undo, i.e. to unencrypt the drive? Would that again require another empty drive to begin with?

  • What caused Cyclic Redundancy Check error on my BitLocker-enabled external hard drive and how to fix it?

    - by Forgiver
    I have an external hard drive connected to my computer (Windows 7 Ultimate 64-bit) via USB, with BitLocker activated on it to protect the data. Yesterday I started downloading a large amount of data from the internet, but the download failed to complete because of a CRC error. I tried to repair the drive with chkdsk f: /r (where F: is the drive letter of that hard drive), and it reports many sectors as "unreadable". What caused the Cyclic Redundancy Check error on my BitLocker-enabled external hard drive, and how can I fix it?

  • Is it dangerous to use both Sky Drive and Dropbox?

    - by Matthew
    I'd like to experiment with Sky Drive, but keep using my Dropbox account unless I decide to switch. This answer gives instructions for how to set up both at the same time, but I'm a little worried about data integrity. Is there any danger involved here? Will Sky Drive and Dropbox fight each other? Note that I am using Sky Drive/Dropbox on multiple computers, so they will be writing data as well as reading it. Is this safe? Edit: I can use them with different folders if necessary, but I'm particularly curious what would happen if they sync from the same folder.

  • Split a large text file without losing data records

    - by Santosh
    I have a 140 MB text file containing detailed information about books in a library. For each book there is a block of details in a standard format. I need to parse the file and insert the data into a database. Parsing the text itself is not an issue; I am having trouble handling such a large file, so I decided to split it into small files of around 2 MB each. But I can't manually split this large file into so many pieces. I tried the HJSplit tool, which does split the file, but that didn't help either: it cuts at byte boundaries, so half of one book's details ends up in one file and the rest in the next, and information gets lost. How can I split the large file without breaking any record in half? Is there a tool which can help in this situation?
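
    If each book's block starts with a recognizable marker line, the file can be cut only at record boundaries, so no record is ever straddled across two pieces. A sketch in awk, assuming (hypothetically) that every record begins with a line starting "Title:"; adjust the pattern and the records-per-file count to the real format:

        # Start a new output file every 5000 records, always at a record boundary
        awk 'BEGIN { part = 1; out = sprintf("books_%03d.txt", part) }
             /^Title:/ { rec++; if (rec > 1 && (rec - 1) % 5000 == 0) {
                 close(out); out = sprintf("books_%03d.txt", ++part) } }
             { print > out }' library.txt

    That said, since the split only exists to make parsing manageable, another option is to skip it entirely and have the parser stream the 140 MB file one record at a time, inserting into the database as it goes.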

  • Filter data in sheets from a master sheet

    - by sam
    I have a 'master sheet' with lots of furniture data in it; column A holds the supplier names. What I would like is to keep the master sheet with all the info and have sub-sheets named by supplier; each sub-sheet would reference the master sheet and pull out all of the rows for that supplier. For example, I would have a sheet called 'Ikea' which would search column A of the master sheet for all entries of 'Ikea' and, where present, copy or reference that row (1:12) in the 'Ikea' sheet. I would like to do it all dynamically, using references rather than copying the data, and I would like it to update automatically rather than having to run a macro to recalculate each time. Can this be done with formulas rather than macros?

  • How can I read a reel-to-reel tape from the 1970s?

    - by Joe Wreschnig
    A close friend of my mother worked at DEC in the 1970s and 1980s. She recently passed away, and in sorting through her estate, my mother discovered some reel-to-reel magnetic tape. We are curious about what might be on it. I haven't yet seen a picture of it, but Wikipedia tells me this is most likely DECtape. Is there any chance the data on it is still good? It was not preserved with great care, but as far as we know it has never been particularly abused, just left in a box and moved a few times. If the data is still intact, do we need to dig up a PDP or VAX to read it, or is there a more modern option?

  • Shared volume for data (multiple MDF) and another shared volume for logs (multiple LDF)

    - by hagensoft
    I have 3 instances of SQL Server 2008, each on a different machine, with multiple databases on each instance, and 2 separate LUNs on my SAN for MDF and LDF files. The NDX and tempdb files run on the local drive of each machine. Is it OK for the 3 instances to share one volume for the data files and another volume for the log files? I don't have thin provisioning on the SAN, so I would rather not constrain disk space by creating multiple volumes, but I was advised that I should create a volume (drive letter) for each instance, if not for each database. I am aware that I should at least split my log and data files. No instance would share the actual database files, just the space on the drive. Any help is appreciated.

  • SFTP: How to keep data out of the DMZ

    - by ChronoFish
    We are investigating solutions to the following problem: we have external (Internet) users who need access to sensitive information. We could offer it to them via SFTP, which would give us a secure transport method, but we don't want to keep the data on the server itself, since it would then reside in the DMZ. Is there an SFTP server with "copy on access", such that if the box in the DMZ were compromised, no actual data would reside on it? I am envisioning an SFTP proxy or SFTP passthrough. Does such a product exist?

  • How to warehouse data that is not needed from SQL Server

    - by I__
    I have been asked to truncate a large table in SQL Server 2008. The data is not needed now but might be needed once every two years. It will NEVER have to be changed, only viewed. The question is: since I don't need the data on a day-to-day basis, what do I do with it to protect it and back it up? Keep in mind that I need it accessible maybe once every two years, and it is FINE for us if the recovery process takes a few hours. The entire table is about 3 million rows, and I need to trim it down to about 1 million rows.
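
    One common pattern is to export the old rows with bcp in SQL Server's native format, verify the file loads back, and only then delete the rows. A sketch; the database, table, column, and server names are all assumptions:

        # Export the ~2 million old rows to a native-format file (-n),
        # using a trusted connection (-T)
        bcp "SELECT * FROM LibraryDB.dbo.BigTable WHERE CreatedDate < '2011-01-01'" queryout D:\archive\bigtable_old.dat -S MYSERVER -T -n

        # Two years later, bulk-load the file into a staging table for viewing
        bcp LibraryDB.dbo.BigTable_Archive in D:\archive\bigtable_old.dat -S MYSERVER -T -n

    Two caveats: TRUNCATE TABLE removes every row, so keeping 1 of the 3 million rows actually means a (preferably batched) DELETE with the same WHERE clause as the export; and the .dat file should go into the normal offsite backup rotation, since it becomes the only copy of that data.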

  • Windows XP - non-user input data filter message after installing wireless keyboard & mouse

    - by James
    After I installed a Microsoft wireless keyboard and mouse and the associated software, I started getting an annoying message titled "Hardware installation" telling me the software I am trying to install did not pass the XP logo tests. The software is for "HID non-user input data filter", and I have two options: continue anyway, or stop the installation. If I continue, the installation fails; if I stop it, another message pops up with a little mouse logo and the whole process repeats itself, and after I am done with that message a third dialog appears. This happens every time I boot up my PC (a desktop). Following advice I found in some forum, I tried to download the Windows update for the HID non-user input data filter, but that installation failed as well. The strange thing is that both the keyboard and mouse are working fine. Is there any way to get past these dialogs?

  • Do any filesystems support multiple forks / streams on directories?

    - by hippietrail
    Apple's HFS+ supports multiple forks, such as the old data and resource forks. NTFS supports alternate data streams, and I believe some *nix filesystems also have some support for multiple file forks or streams. Given that directories (folders) are just a kind of file at the filesystem level, I'm wondering: do any of the filesystems which support this feature support it for directories as well as files? (Or, indeed, allow directories inside the alternate forks/streams?) I'm mostly asking out of curiosity rather than wanting to use such a feature, but one use would be additional metadata for directories, which seems to be the most common use of these streams for files currently.

  • Scrape data from a website and post it on a blog (WordPress)

    - by Pennf0lio
    This could be in DocType, but I'm looking for a piece of software, or just a WordPress plugin. I want to fetch data from another website and automatically post it on my blog (WordPress-powered). The site doesn't have an RSS feed or an API to get the data, so right now I have to manually copy and paste each item and post it on WordPress. Do you know of an alternative to my process, or of a piece of software or a plugin that does the job? Thanks!
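
    If the source pages have regular markup, even a small shell pipeline can stand in for a plugin. A rough sketch using curl and WP-CLI; the URL, the <h2 class="item"> selector, and the field layout are pure assumptions about the source site:

        # Pull the item titles out of the page and create a draft post for each
        curl -s "http://example.com/listing" \
            | grep -o '<h2 class="item">[^<]*</h2>' \
            | sed 's/<[^>]*>//g' \
            | while read -r title; do
                  wp post create --post_title="$title" --post_status=draft \
                      --path=/var/www/blog
              done

    Creating drafts rather than published posts makes it easy to review what the scraper produced, and it's worth checking the source site's terms before scraping it at all.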
