Search Results

Search found 96184 results on 3848 pages for 'recent file list'.


  • Need to get a list of all users within a subnet of servers

    - by mikedopp
    I am looking to write a batch or VBS script to gather all users (local to the server, i.e. administrators or other local accounts, not AD users) on a collection of servers inside my network. I assume I could do this by subnet. I could even put the server names into a CSV text file for the script to read from and report back to. Lots to ask. I would use net user, however I run into local access only. Ideas? Or are there too many security walls for this to work?
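
    A hedged sketch of one batch-file approach (the server list and output file names are made up; wmic needs admin rights on each target):

        @echo off
        rem Read server names, one per line, from servers.txt and append each
        rem server's local accounts to localusers.txt.
        for /f %%S in (servers.txt) do (
            echo ===== %%S ===== >> localusers.txt
            wmic /node:"%%S" useraccount where "LocalAccount=True" get Name >> localusers.txt
        )

    This sidesteps the net user limitation by querying Win32_UserAccount remotely over WMI rather than running a command on each machine locally.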


  • Does this exist: a standardized way of documenting a file-system structure

    - by eegg
    At work, I'm in charge of maintaining the organization of a whole lot of varied data on a standard file-system. Part of this is coming up with sensible classification (by similarity, need, read/write access, etc), but the bigger part is actually documenting it: what documents/files/media should go where, what should not be in this directory, "for something slightly different, see ../../other-dir", etc.

    At the moment, I've documented this using a plaintext file filing.txt in every directory I want to document. If someone is unsure what's meant to be in any directory, they read that file. This works alright, but it seems odd that I have this primitive custom solution to a problem that any maintainer of a non-trivial directory structure must experience. Every company I've known of, for example, has some kind of shared file-system where agreed terminology for categorization is important. In my experience, people just have to learn what's what by trial-and-error and experimentation.

    So allow me to propose a better solution, and hopefully you can tell me if it exists. Any directory on any filesystem can have a hidden plaintext file named .filing. Its contents are descriptive human language. It uses some markup like Markdown, with little more than bold, italic, and (relative) hyperlinks to other directories. Now a suitably-enabled file browser will check for a file named .filing whenever it displays a directory. If it exists, its contents are parsed and displayed in an unobtrusive pane near the directory-path widget. Any links therein can be clicked, and the user will be taken to the target directory of that link.

    I think that the effort of implementing such a standard would pay back many times over in usability gains. We would have, say, plugins for Nautilus, Konqueror, etc. It could be used to display directory information in the standard file lists served by webservers. And so on.

    So, question: does such a thing exist? If not, why not? Do people think it's a worthwhile idea?
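
    As a proof of concept, the reading side of the proposal is tiny; a sketch following the conventions described above (everything here is hypothetical, not an existing tool):

        #!/usr/bin/env python3
        """Print the .filing notes for a directory, if present."""
        import os
        import sys

        def filing_notes(directory):
            """Return the contents of directory/.filing, or None if absent."""
            path = os.path.join(directory, ".filing")
            if not os.path.isfile(path):
                return None
            with open(path, encoding="utf-8") as f:
                return f.read()

        if __name__ == "__main__":
            d = sys.argv[1] if len(sys.argv) > 1 else "."
            notes = filing_notes(d)
            print(notes if notes is not None else "(no .filing notes for {})".format(d))

    A file-manager plugin would do the same lookup and render the Markdown into its pane; the hard part is adoption, not implementation.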


  • haproxy access list using path_dir having issues with firefox

    - by user11243
    I'm trying to route all requests containing a path directory of /socket.io/ to a separate port with HAProxy. Here is my config file:

        global
            maxconn 4096   # Total Max Connections. This is dependent on ulimit
            nbproc 2

        defaults
            mode http

        frontend all 0.0.0.0:80
            timeout client 86400000
            default_backend web_servers
            acl is_stream path_dir socket.io
            use_backend stream_servers if is_stream

        backend web_servers
            balance roundrobin
            option forwardfor   # This sets X-Forwarded-For
            timeout server 30000
            timeout connect 4000
            server web1 127.0.0.1:4000 weight 1 maxconn 1024 check

        backend stream_servers
            balance roundrobin
            option forwardfor   # This sets X-Forwarded-For
            timeout queue 5000
            timeout server 86400000
            timeout connect 86400000
            server stream1 127.0.0.1:5100 weight 1 maxconn 1024 check

    URL paths with /socket.io/ get correctly directed to port 5100 in Chrome and Safari, but not in Firefox. I'm running HAProxy locally on my Mac for dev, not sure if that has anything to do with it. I'm using HAProxy 1.4.8 and Firefox 3.6.15. I've tried clearing the cache in Firefox and it didn't help, so I'm thinking there's something wrong with the way HAProxy parses the Firefox request headers.
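
    A hedged guess, not something the poster confirmed: HAProxy 1.4 inspects only the first request of a keep-alive connection when routing, and Firefox reuses connections aggressively, so a /socket.io/ request can ride a connection already pinned to web_servers. Forcing per-request routing would look like:

        defaults
            mode http
            option http-server-close   # close server-side so every request is re-evaluated

    If that is the cause, Chrome and Safari only appear to work because they happen to open fresh connections for the /socket.io/ requests.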


  • Phantom Local Disks appearing in my drive list

    - by Paul
    I seem to have several phantom Local Disks mapped to different letters that are 0 bytes in size. Strangely, they do not show up when I view my drives through Windows Explorer, but if I open an application such as ACDSee Pro or MS Word and then go to open a file, I can see all these Local Disks mapped to different letters. This means when I plug in my external hard disk it ends up mapped to letter R instead of its usual G, which messes up any programs I have pointing to it by default. How did they get there, and more importantly, how do I get rid of them? I'm on a Windows 7 Home Premium 32-bit machine.
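
    One hedged place to look: virtual drives created with the subst command behave much like this (they appear in applications' open dialogs and reserve drive letters). From a Command Prompt:

        subst            rem lists any virtual drives and what they map to
        subst R: /D      rem deletes the mapping for R: (repeat per stray letter)

    If subst shows nothing, the mappings may be defined per-user elsewhere (e.g. a login script), so treat this as a first check rather than a sure fix.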


  • Adding file to /etc/cron.d doesn't make it run (ubuntu 10.04)

    - by tom
    If I scp a cron file into /etc/cron.d, it doesn't run unless I edit the file and change the command; then crond seems to pick up the cron file. How can I make cron reload its cron files in Ubuntu 10.04? 'touch'ing the file doesn't work, nor does 'restart cron' or 'reload cron'. My cron file is set to run every minute and logs to a file. Nothing ends up in the log file until I edit the command, and there's no entry for it in /var/log/syslog. I'm stumped. Here's my cron file, saved to /etc/cron.d/runscript:

        # Runs the script every minute. This is safe because it will exit with success if it's already running
        * * * * * www-data if [ -f /usr/local/bin/thing ]; then exec /usr/bin/php /usr/local/bin/thing mode:prod -a 14 -d >> /var/log/thing/mything.log 2>&1; else echo `date +'[%D %T]'` "Thing not deployed. Command not run\n" >> /var/log/thing/mything.log; fi &


  • zsh : How to list directory content with tab?

    - by Philippe CM
    I just switched from BASH to ZSH and things are pretty good, but: when I start typing cd /usr/share/s and hit TAB, this is what I get:

        $ cd /usr/share/sane/
        sane/         skype/                  ssl-cert/
        screen/       smplayer/               strigi/
        seed-gtk3/    snmp/                   synaptic/
        sgml/         software-properties/    system-config-printer/
        sgml-base/    soprano/                sysv-rc/
        sgml-data/    sounds/
        simple-scan/  splashy/

    And this is OK. If I then hit TAB again, I get $ cd /usr/share/screen/, the next candidate, which is also OK. (BTW, how do I cycle back to the previous candidate? Sorry, on to my question.) Now what if I want to see the contents of /usr/share/screen/? You know, BASH-style? The cursor is at the end of the line; will I have to ctrl-a (or Home), then del del (to erase cd), then ls, then ENTER? That seems like a lot of typing. (And it - possibly unnecessarily - enters the command in the history.) Wouldn't there be a key (maybe modifier-TAB? but the obvious candidates are already taken by the desktop... I digress) that would tell zsh to stop cycling through /usr/share/ and instead just list the contents of /usr/share/screen/?
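
    For what it's worth, a sketch of a .zshrc widget along those lines (the function name and key bindings are arbitrary choices, not standard zsh defaults):

        # Show the contents of the directory named by the last word on the
        # command line, without running anything or touching history.
        function list-target-dir() {
            zle -M "$(ls -- ${(Q)${(z)BUFFER}[-1]} 2>/dev/null)"
        }
        zle -N list-target-dir
        bindkey '^Xl' list-target-dir          # Ctrl-X then l

        # Cycle backwards through menu completion candidates:
        bindkey '^[[Z' reverse-menu-complete   # Shift-Tab in most terminals

    reverse-menu-complete is a standard zsh widget, so the Shift-Tab line also answers the aside about cycling back to the previous candidate.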


  • Huge discrepancy in Inkscape file size

    - by Keyran
    When using Inkscape to create many pictures with common elements across them, I tend to copy the first SVG file I have created as many times as I need pictures, and then edit the copies. If I reuse files across projects, a file can end up being copied and modified tens to hundreds of times. I have recently realized that the latest copies have a size between 29 and 60 MB, slowing my computer down significantly. My pictures are very simple, nothing that would normally go over 1 MB in size. As an experiment, I copied the entire content of one of the latest files into a new Inkscape file. I am certain that I copied the content of the file entirely (I have only one layer and I used the "Select All" option). The new file has a size of 102.2 KB. This would indicate that about 30 MB of data per file is irrelevant to me. What could be the cause of this size difference? Is there a way to reduce the size of a file without having to copy the content into a new file? I am using Inkscape 0.48.4 on Debian Unstable. Thanks for any input you might be able to provide!
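
    One commonly suggested cleanup, on the assumption that the extra ~30 MB is unused definitions (gradients, filters, markers) accumulating in the <defs> section of each copy: in the GUI this is File > Vacuum Defs, and from the command line (Inkscape 0.48):

        # rewrite drawing.svg in place with unused <defs> removed
        inkscape --vacuum-defs drawing.svg

    Since "Select All" copies only visible objects, not defs, that would also explain why the pasted copy came out at 102.2 KB.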


  • BKF file corruption

    - by Naitik Semwaal
    I don't want to ask anything here, as I have nothing to ask. Instead, would you guys mind if I share some useful info here? If not, then let me proceed. You must have heard about "backup", the process in which we create backup copies of our crucial data in a file called a BKF (backup) file. Having a valid BKF file provides security for our data against unwanted data loss or corruption. Whenever such a critical situation takes place, we can restore our BKF file and get our data back (but only if it was backed up earlier). Did you guys ever think about why a BKF file gets corrupted? What could be the reasons which make a BKF file corrupted or inaccessible? One day while googling, I found a blog post named "Reasons of BKF file corruption". I read it, and it was very informative. In this blog, I came to know about the reasons for corruption in BKF files. I shared the blog here so that users can read it and clear their doubts about BKF file corruption. I hope this will be helpful.


  • Business rule validation of hierarchical list of objects ASP.NET MVC

    - by SergeanT
    I have a list of objects that are organized in a tree using a Depth property:

        public class Quota
        {
            [Range(0, int.MaxValue, ErrorMessage = "Please enter an amount above zero.")]
            public int Amount { get; set; }

            public int Depth { get; set; }

            [Required]
            [RegularExpression("^[a-zA-Z]+$")]
            public string Origin { get; set; }

            // ... other properties with validation attributes
        }

    For example data (amount - origin):

        100 originA
        200 originB
            50 originC
            150 originD

    the model data looks like:

        IList<Quota> model = new List<Quota>();
        model.Add(new Quota { Amount = 100, Depth = 0, Origin = "originA" });
        model.Add(new Quota { Amount = 200, Depth = 0, Origin = "originB" });
        model.Add(new Quota { Amount = 50,  Depth = 1, Origin = "originC" });
        model.Add(new Quota { Amount = 150, Depth = 1, Origin = "originD" });

    Editing of the list

    I use "Editing a variable length list, ASP.NET MVC 2-style" to wire up editing of the list. Controller actions, QuotaController.cs:

        public class QuotaController : Controller
        {
            //
            // GET: /Quota/EditList
            public ActionResult EditList()
            {
                IList<Quota> model = // ... assignments as in the example above;
                return View(model);
            }

            //
            // POST: /Quota/EditList
            [HttpPost]
            public ActionResult EditList(IList<Quota> quotas)
            {
                if (ModelState.IsValid)
                {
                    // ... save logic
                    return RedirectToAction("Details");
                }
                return View(quotas); // Redisplay the form with errors
            }

            // ... other controller actions
        }

    View EditList.aspx:

        <%@ Page Title="" Language="C#" ... Inherits="System.Web.Mvc.ViewPage<IList<Quota>>" %>
        ...
        <h2>Edit Quotas</h2>
        <%= Html.ValidationSummary("Fix errors:") %>
        <% using (Html.BeginForm()) {
               foreach (var quota in Model) {
                   Html.RenderPartial("QuotaEditorRow", quota);
               } %>
        <% } %>
        ...

    Partial view QuotaEditorRow.ascx:

        <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<Quota>" %>
        <div class="quotas" style="margin-left: <%= Model.Depth*45 %>px">
        <% using (Html.BeginCollectionItem("Quotas")) { %>
            <%= Html.HiddenFor(m => m.Id) %>
            <%= Html.HiddenFor(m => m.Depth) %>
            <%= Html.TextBoxFor(m => m.Amount, new { @class = "number", size = 5 }) %>
            <%= Html.ValidationMessageFor(m => m.Amount) %>
            Origin: <%= Html.TextBoxFor(m => m.Origin) %>
            <%= Html.ValidationMessageFor(m => m.Origin) %>
            ...
        <% } %>
        </div>

    Business rule validation

    How do I implement validation of the business rule that the amount of a quota and the sum of the amounts of its nested quotas should be equal (e.g. 200 = 50 + 150 in the example)? I want the corresponding Html.TextBoxFor(m => m.Amount) inputs to be highlighted red if the rule is broken for them; in the example, if the user enters 201 instead of 200, it should be red on submit. Using server validation only. Thanks a lot for any advice.
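
    A minimal server-side sketch of that rule, run in the POST action before checking ModelState.IsValid. The model-state key format is an assumption on my part - BeginCollectionItem generates GUID indices, so the keys may need to be captured from the posted form instead:

        // For each quota, sum the amounts of its immediate children: the run
        // of following items with Depth == parent Depth + 1, stopping at the
        // next item at the same or shallower depth.
        for (int i = 0; i < quotas.Count; i++)
        {
            int childSum = 0;
            bool hasChildren = false;
            for (int j = i + 1; j < quotas.Count && quotas[j].Depth > quotas[i].Depth; j++)
            {
                if (quotas[j].Depth == quotas[i].Depth + 1)
                {
                    hasChildren = true;
                    childSum += quotas[j].Amount;
                }
            }
            if (hasChildren && childSum != quotas[i].Amount)
            {
                // An entry in ModelState for the field is what makes
                // TextBoxFor/ValidationMessageFor render the error styling.
                ModelState.AddModelError(
                    string.Format("Quotas[{0}].Amount", i),
                    "Amount must equal the sum of its nested quotas.");
            }
        }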


  • Split a Large File In C++

    - by wdow88
    Hey all, I'm trying to write a program that takes a large file (of any type) and splits it into many smaller "chunks". I think I have the basic idea down, but for some reason I cannot create a chunk size over 12,000 bytes. I know there are a few solutions on Google, etc., but I am more interested in learning what the origin of this limitation is than in actually using the program to split files.

        // This program splits a larger file into smaller files of a user-inputted size.
        #include <iostream>
        #include <fstream>
        #include <string>
        #include <sstream>
        #include <direct.h>
        #include <stdlib.h>
        using namespace std;

        void GetCurrentPath(char* buffer)
        {
            _getcwd(buffer, _MAX_PATH);
        }

        int main()
        {
            // use the function to get the path
            char CurrentPath[_MAX_PATH];
            GetCurrentPath(CurrentPath); // Get the current directory (used for displaying output)

            fstream bigFile;
            string filename;
            int partsize;

            cout << "Enter a file name: ";
            cin >> filename;   // Receive target file
            cout << "Enter the number of bytes in each smaller file: ";
            cin >> partsize;   // Receive volume size

            bigFile.open(filename.c_str(), ios::in | ios::binary);
            bigFile.seekg(0, ios::end);  // position get-ptr 0 bytes from end
            int size = bigFile.tellg();  // get-ptr position is now same as file size
            bigFile.seekg(0, ios::beg);  // position get-ptr 0 bytes from beginning

            for (int i = 0; i <= (size / partsize); i++)
            {
                // Build file name
                string partname = filename;  // The original filename
                string charnum;              // archive number
                stringstream out;            // used to build the archive name
                out << "." << i;
                charnum = out.str();
                partname.append(charnum);    // put the part name together

                // Write new file part
                fstream filePart;
                filePart.open(partname.c_str(), ios::out | ios::binary); // Open new file with the name built above

                // Check if near the end of file
                if (bigFile.tellg() < (size - (size % partsize)))
                {
                    filePart.write(reinterpret_cast<char *>(&bigFile), partsize); // Write the selected amount to the file
                    filePart.close();                   // close file
                    bigFile.seekg(partsize, ios::cur);  // move pointer to next position to be written
                }
                // Change the size of the last volume because it is the end of the file
                else
                {
                    filePart.write(reinterpret_cast<char *>(&bigFile), (size % partsize)); // Write the selected amount to the file
                    filePart.close();                   // close file
                }
                cout << "File " << CurrentPath << partname << " produced" << endl; // display the progress of the split
            }

            bigFile.close();
            cout << "Split Complete." << endl;
            return 0;
        }

    Any ideas? Thanks!
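
    One observation on the code as posted: filePart.write(reinterpret_cast<char *>(&bigFile), partsize) writes the raw bytes of the fstream object itself, not data read from the file, which would explain both the garbage output and the hard ceiling on chunk size (it tracks the in-memory stream object, not partsize bytes of file data). A sketch of the read/write step that was probably intended (add #include <vector>; variable names match the program above):

        std::vector<char> buffer(partsize);
        bigFile.read(&buffer[0], partsize);        // pull the next chunk from the source
        std::streamsize got = bigFile.gcount();    // may be < partsize on the last chunk
        if (got > 0)
            filePart.write(&buffer[0], got);       // write only what was actually read

    With read/gcount driving the loop, the explicit seekg bookkeeping and the end-of-file special case both become unnecessary.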


  • SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video

    - by pinaldave
    Importing data into a database is one of the most important tasks. I often receive questions regarding what the quickest way to insert CSV data is, or how to import CSV data into a SQL Server table. Honestly, the process is very simple and the script is even simpler. In today's SQL in Sixty Seconds video we will learn how quickly we can insert CSV data into SQL Server. The steps to import CSV are very simple:

    1. Create a table
    2. Use BULK INSERT to import the data
    3. Verify the data

    Done! Absolutely it is that simple.

    More on Importing CSV Data:

    - SQL SERVER – Import CSV File Into SQL Server Using Bulk Insert – Load Comma Delimited File Into SQL Server
    - SQL SERVER – Import CSV File into Database Table Using SSIS
    - SQL SERVER – Create a Comma Delimited List Using SELECT Clause From Table Column
    - SQL SERVER – Comma Separated Values (CSV) from Table Column
    - SQL SERVER – Comma Separated Values (CSV) from Table Column – Part 2

    I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea, we promise to share educational material with you.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video
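
    A minimal script along those lines (the table, columns, and file path are made-up examples, not from the video):

        -- 1. Create the target table
        CREATE TABLE dbo.CSVTest (ID INT, FirstName VARCHAR(40), LastName VARCHAR(40));

        -- 2. Bulk insert the comma-delimited file
        BULK INSERT dbo.CSVTest
        FROM 'C:\csvtest.txt'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

        -- 3. Verify the data
        SELECT * FROM dbo.CSVTest;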


  • Can't exec "locale": No such file or directory

    - by Alex
    I am new to Linux. I was trying to install Wine, and after I followed instructions from a YouTube video I got to the point where I needed to install Wine from the Ubuntu Software Center. The problem is the Ubuntu Software Center doesn't work anymore; it asks me to repair it, and when I push the Repair button it gives me this error:

        installArchives() failed: Can't exec "locale": No such file or directory at /usr/share/perl5/Debconf/Encoding.pm line 16.
        Use of uninitialized value $Debconf::Encoding::charmap in scalar chomp at /usr/share/perl5/Debconf/Encoding.pm line 17.
        Preconfiguring packages ...
        Can't exec "locale": No such file or directory at /usr/share/perl5/Debconf/Encoding.pm line 16.
        Use of uninitialized value $Debconf::Encoding::charmap in scalar chomp at /usr/share/perl5/Debconf/Encoding.pm line 17.
        Preconfiguring packages ...
        Can't exec "locale": No such file or directory at /usr/share/perl5/Debconf/Encoding.pm line 16.
        Use of uninitialized value $Debconf::Encoding::charmap in scalar chomp at /usr/share/perl5/Debconf/Encoding.pm line 17.
        Preconfiguring packages ...
        dpkg: warning: 'ldconfig' not found in PATH or not executable.
        dpkg: error: 1 expected program not found in PATH or not executable.
        Note: root's PATH should usually contain /usr/local/sbin, /usr/sbin and /sbin.
        Error in function:
        SystemError: E:Sub-process /usr/bin/dpkg returned an error code (2)

    Please help me. Thank you :D
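
    Both locale and ldconfig ship with the base system, so a reasonable first check (an assumption on my part: something clobbered the PATH, e.g. a stray edit to /etc/environment) would be:

        echo $PATH                              # should include /usr/bin, /usr/sbin and /sbin
        ls -l /usr/bin/locale /sbin/ldconfig    # both binaries should exist and be executable
        cat /etc/environment                    # the PATH line here is a common place for typos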


  • Problem with Mono and .exe file

    - by Vere Nicolson
    I have purchased a piece of software to configure programmable radio control transmitters. It says it will run on Linux; see below:

    "Digital Radio runs on: Microsoft Windows 2000/2003/XP, Microsoft Windows Vista/Seven/2008, Linux Ubuntu or a distribution with Mono, 32 or 64 bit, also in a virtual machine. Linux requires the Mono package installed, along with the Visual Basic 2005 runtime library. The Linux version is the same executable file as the Windows platform, and can be executed using Mono. You don't need Wine. All the tests have been done on Ubuntu Desktop 10.10."

    I have tried for weeks to get the drivers for the cable to work in XP or Win7, and I admit defeat. It looks like Ubuntu can run the cable effortlessly, but now I can't get the software going. I tried to run it in Ubuntu 10.04 with Mono; the GUI failed and I got the following message in the terminal:

        $ mono ~/Desktop/GigRadioLinux/DigitalRadio/DigitalRadio.exe
        The entry point method could not be loaded

    Windows installation requires entering a 30-odd-character passkey and a 4.24k text file as a "license" while running the exe file. Can someone tell me how I enter the passkey and license in the terminal, or is that not my primary problem? I don't understand "entry point method". Tried Wine and that didn't work either. The developer responded to my earlier emails re the cable drivers, but hasn't replied to questions regarding this. If I have left out anything important, let me know and I will try to supply more information.
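
    Since the vendor notes call out the Visual Basic runtime, which the base Mono packages don't include, one hedged thing to try (package names from the Ubuntu repositories; worth double-checking with apt-cache search):

        sudo apt-get install mono-complete mono-vbnc
        mono ~/Desktop/GigRadioLinux/DigitalRadio/DigitalRadio.exe

    "The entry point method could not be loaded" is the error Mono prints when an assembly the Main method depends on (here, presumably Microsoft.VisualBasic.dll) can't be resolved, which would fit the missing-runtime theory.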


  • Couldn't find package - But package is listed in the Packages file

    - by Chris
    (Quoted items are redacted.) I am using a private repository and am currently trying to repackage some 3rd-party packages. I extract the package, make a few modifications (just the control files, to fit with company policy - though sometimes file install locations too, not in this case), and repackage (and usually rename).

    Normally I copy the files into a new blank debhelper project and reconstruct the package. However, with a recent one I am attempting to convert, some libraries and other pieces aren't linking properly (I did copy the postinst, postrm, and preinst files along with all DEBIAN files exactly); the original package worked, but my repackage doesn't, despite providing the same files in the same locations and the same postinst and preinst. So I was attempting to just modify the current package's control files (the original package is not very good, will not list in our repository, and getting a better one from the 3rd party is not an option). I also renamed the package. I did the following:

    1. dpkg-deb -R "directory"
    2. Modify DEBIAN/control
    3. dpkg-deb -b "directory" "package name I want"

    I did this and put it in our repository. The package shows up in the "Packages" file on the repository, and running apt-get update on the client side shows the package in:

        /var/lib/apt/lists/"server"_"location"_Packages

    However, when I do an apt-get install on the package name (as listed in the Packages file - I did a copy-paste), it says it can't find the package. Same with an apt-cache search. The Packages listing is as follows (names redacted):

        Package: "package name"
        Priority: extra
        Section: unknown
        Maintainer: "maintainer"
        Architecture: any
        Version: 1.0-lucid5
        Depends: libc
        Filename: "directory"/"package_filename"
        Size: 2206292
        MD5sum: "md5sum"
        SHA1: "sha key"
        SHA256: "sha256 key"
        Description: "description"

    I am running as sudo (and tried as root as well). I don't understand why apt-get won't see the package. Can you point out any flaws in what I have done, or perhaps some help on getting apt-get to properly see the package? Or perhaps an alternative - I am not even sure if this is a valid way to repackage something. Thanks.
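
    For reference, the repack cycle described above as a shell session (file names are placeholders), with an extra sanity-check step at the end:

        dpkg-deb -R original.deb workdir          # unpack payload plus the DEBIAN/ metadata
        $EDITOR workdir/DEBIAN/control            # adjust Package:, Version:, and friends
        dpkg-deb -b workdir renamed-package.deb   # rebuild the .deb
        dpkg-deb -I renamed-package.deb           # inspect the control data apt will see

    One hedged observation: the Packages entry above shows Architecture: any, which is a source-package value; binary packages normally carry a concrete architecture (e.g. i386 or amd64) or all, and apt skips entries whose architecture doesn't match the client, which could produce exactly this "can't find the package" behaviour.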


  • MySQL my.cnf file not being read, Ubuntu 10.04 64bit

    - by reallyordinary
    I've been researching this for a few hours with no luck. Basically it looks like my server's my.cnf file isn't being read at all. I've searched my server, and there's only one my.cnf file on it, located at /etc/mysql/my.cnf. Its ownership is root:root. I'm running Ubuntu 10.04 64-bit on a Linode.com server, with the latest versions of MySQL and PHP installed.

    I've edited the my.cnf file, commented out "skip-innodb", and set InnoDB to be the default storage engine using:

        default-storage-engine = innodb

    and then restarted MySQL. But when I do SHOW ENGINES, MyISAM is still coming up as the default engine. Also, none of the InnoDB settings I've added to the my.cnf file are being read. For example, I have this in my.cnf:

        innodb_buffer_pool_size=4G

    but in phpMyAdmin, InnoDB shows a buffer pool size of 8,192 KiB. Similarly, I have this in the my.cnf:

        innodb_data-file_path = ibdata1:500M:autoextend

    but it reads as ibdata1:10M:autoextend. It doesn't look like MyISAM info is being read from the my.cnf file either: the my.cnf file has skip-external-locking commented out, but it shows as "on" in phpMyAdmin. So, yeah, it looks like nothing in the my.cnf file is being read at all. But the server still works; I'm running a Drupal site on it and it seems to operate fine. So MySQL seems to be drawing default settings from... some mysterious secret location. Any idea how I can make MySQL see and use this my.cnf file?

    Actually, wait - it looks like it may be being read after all, not sure. I checked the error log and found this:

        101128  4:28:52 [ERROR] Cannot find or open table databasename/cache_apachesolr from the internal data dictionary of InnoDB though the .frm file for the table exists. Maybe you have deleted and recreated InnoDB data files but have forgotten to delete the corresponding .frm files of InnoDB tables, or you have moved .frm files to another database? or, the table contains indexes that this version of the engine doesn't support. See http://dev.mysql.com/doc/refman/5.1/en/innodb-troubleshooting.html how you can resolve the problem.
        InnoDB: Error: auto-extending data file ./ibdata1 is of a different size
        InnoDB: 640 pages (rounded down to MB) than specified in the .cnf file:
        InnoDB: initial 32000 pages, max 0 (relevant if non-zero) pages!
        InnoDB: Could not open or create data files.
        InnoDB: If you tried to add new data files, and it failed here,
        InnoDB: you should now edit innodb_data_file_path in my.cnf back
        InnoDB: to what it was, and remove the new ibdata files InnoDB created
        InnoDB: in this failed attempt. InnoDB only wrote those files full of
        InnoDB: zeros, but did not yet use them in any way. But be careful: do not
        InnoDB: remove old data files which contain your precious data!
        101128  4:28:52 [ERROR] Plugin 'InnoDB' init function returned error.
        101128  4:28:52 [ERROR] Plugin 'InnoDB' registration as a STORAGE ENGINE failed.
        101128  4:28:52 [ERROR] /usr/sbin/mysqld: unknown variable 'innodb_lock_wait_timout=50'
        101128  4:28:52 [ERROR] Aborting
        101128  4:28:52 [Note] /usr/sbin/mysqld: Shutdown complete
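
    Two quick checks that settle whether the file is read (both are standard mysqld options):

        mysqld --print-defaults                   # the exact options mysqld will apply
        mysqld --help --verbose 2>/dev/null | grep -A1 "Default options"   # which my.cnf paths are consulted, in order

    And a grounded observation from the log itself: the restart aborted on the misspelled variable innodb_lock_wait_timout, and InnoDB failed to initialize because the existing ./ibdata1 no longer matches the new innodb_data_file_path - so the file is being read; the settings in it are what is stopping InnoDB from registering.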


  • SQL SERVER – Get File Statistics Using fn_virtualfilestats

    - by pinaldave
    Quite often when I am staring at my SSMS I wonder what is going on under the hood in my SQL Server. I often want to know which database is very busy and which database is a bit slow because of IO issues. Sometimes I think at the file level as well: I want to know which MDF or NDF is busiest and doing most of the work. The following query gets those results very quickly:

        SELECT DB_NAME(vfs.DbId) DatabaseName,
               mf.name,
               mf.physical_name,
               vfs.BytesRead,
               vfs.BytesWritten,
               vfs.IoStallMS,
               vfs.IoStallReadMS,
               vfs.IoStallWriteMS,
               vfs.NumberReads,
               vfs.NumberWrites,
               (Size*8)/1024 Size_MB
        FROM ::fn_virtualfilestats(NULL,NULL) vfs
        INNER JOIN sys.master_files mf
            ON mf.database_id = vfs.DbId
            AND mf.FILE_ID = vfs.FileId
        GO

    When you run the above query you will get much valuable information, such as the size of each file and how many reads and writes are done against it. It also displays the read/write data in bytes. If IO has caused any stall (delay) in reads or writes, you can see that as well. I keep this handy but have not shared it on the blog earlier.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL View, T SQL, Technology Tagged: Statistics


  • Database Connectivity Test with UDL File

    - by Ben Griswold
    I bounced around between projects a lot last week. What each project had in common was the need to validate at least one SQL connection. Whether you have SQL tools like SSMS installed or not, this is a very easy task if you are aware of UDL (Universal Data Link) files. Create a new file and name it anything, as long as it has the .udl extension. Open the file and choose a provider. Click Next >> or navigate to the Connection tab to provide connection information. Once you provide server and login credentials, the database list will populate. At this point, you know the connection is valid, but go ahead and click the Test Connection button anyway. On the final tab, you can provide extra connection information like Application Name, which can come in handy. The All tab is beneficial if you want to build a valid connection string to include in your own applications. If you save the file and then open it in Notepad, you'll find said connection string:

        Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=master;Data Source=(local);Application Name=TestApp

    I hope this tip helps save you some time. How do you test if you don't have SSMS installed?


  • Ubuntu One Sync as a File Backup Solution?

    - by Jeff
    I was hoping to utilize Ubuntu One, and in particular the syncing feature within Ubuntu One, to provide offsite backup for some of my files. My intention was to mark any of my folders that have important files as 'folders to synchronize' to Ubuntu One. It works great in that whenever an important file is placed in the folder, the file is copied up to Ubuntu One (hence creating a backup). However, if any of these important files are lost or accidentally deleted from my computer, then due to the synchronization they are also immediately deleted from Ubuntu One. This approach does not work very well for backup. On the one hand, I really like the automatic way in which the sync feature uploads any of my important files to Ubuntu One, but on the other hand, if I lose a file on my computer it will likely be removed from the cloud as well (via synchronization). What approach are others taking to back up their important files to Ubuntu One? I don't want to have to manually upload my important files to Ubuntu One and remember to upload other important files as they are created on my computer. Your thoughts and suggestions are greatly appreciated.


  • Changes to File Store Provider in UCM PS3

    - by Kevin Smith
    In the recent PS3 release of UCM (11.1.1.4.0) there are some significant changes to the File Store Provider (FSP) configuration. For new PS3 installs (not upgrades from PS2), the FSP default storage rule includes a dispersion rule that changes the web-layout and vault paths by adding dispersion directories, to limit the number of files in the vault and web-layout directories. What that means is that if you install a new PS3 UCM instance and migrate content in from a previous version of UCM, the web URL will change. That is a critical problem for web sites and for general document management. See below for some details on the FSP configuration in PS3 and how you can change the default behavior. Use the link below to read the rest of this post, where I describe the issue in detail and provide instructions for modifying a PS3 instance to use the old format for the web-layout path.


  • Why can't GoogleBot forget a 404 File Does Not Exist?

    - by Sam
    Hi folks, for two months now, googlebot has been trying to get a file which does not exist anymore. This is just one example out of many: I had renamed the file to a better name and removed the old file. Now, why does Google insist on fetching a file which it has already seen does not exist, for months? Doesn't it just give up and get on with being a happy bot? My error log file is filled with these repeating lines, all wanting to get that one file, although it has known for a long time that it's not there:

        [Sat Mar 05 01:55:41 2011] [error] [client 66.249.66.177] File does not exist: /var/www/vhosts/website.org/httpdocs/extraNeus.php
        [Sat Mar 05 01:58:20 2011] [error] [client 66.249.66.177] File does not exist: /var/www/vhosts/website.org/httpdocs/extraNeus.php
        [Sat Mar 05 02:03:57 2011] [error] [client 66.249.66.177] File does not exist: /var/www/vhosts/website.org/httpdocs/extraNeus.php
        on and on... and on...

    What to do in these situations? Are there automatic redirection rules that handle this via a 301 to the home page?
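
    If a 301 is the goal, a minimal mod_alias rule for the renamed file (the new target path here is a placeholder; the post doesn't give the new name) would be:

        Redirect 301 /extraNeus.php /new-name.php

    placed in the vhost config or an .htaccess file; a redirect to the home page instead would simply use / as the target.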


  • Unable to mount external hard drive - Damaged file system and MFT

    - by Khalifa Abbas Lame
    I get the following error when I try to mount my external hard drive:

        UNABLE TO MOUNT
        Error mounting /dev/sdc1 at /media/khalibloo/Khalibloo2:
        Command-line `mount -t "ntfs" -o "uhelper=udisks2,nodev,nosuid,uid=1000,gid=1000,dmask=0077,fmask=0177" "/dev/sdc1" "/media/khalibloo/Khalibloo2"' exited with non-zero exit status 13:
        ntfs_attr_pread_i: ntfs_pread failed: Input/output error
        Failed to read of MFT, mft=6 count=1 br=-1: Input/output error
        Failed to open inode FILE_Bitmap: Input/output error
        Failed to mount '/dev/sdc1': Input/output error
        NTFS is either inconsistent, or there is a hardware fault, or it's a SoftRAID/FakeRAID hardware. In the first case run chkdsk /f on Windows then reboot into Windows twice. The usage of the /f parameter is very important! If the device is a SoftRAID/FakeRAID then first activate it and mount a different device under the /dev/mapper/ directory, (e.g. /dev/mapper/nvidia_eahaabcc1). Please see the 'dmraid' documentation for more details.

    It doesn't mount on Windows either: "I/O Device error". It's an NTFS hard drive with a single partition.

    Of course, I tried chkdsk /f. It reported several file segments as unreadable, but didn't say whether it fixed them or not (apparently not). I also tried with the /b flag. ntfsfix reported the volume as corrupt. TestDisk was able to fix a small error with the partition table by adding the "80" flag for the active (only) partition. TestDisk also confirmed that the boot sector was fine and that it matched the backup. However, when attempting to repair the MFT, it couldn't read the MFT. It also couldn't list the files on the hard drive; it says the file system may be damaged. Active@ also shows that the MFT is missing or corrupt. So how do I fix the file system? Or the MFT?


  • Save password in WCF adapter binding file

    - by Edmund Zhao
    The binding file for the WCF adapter doesn't save the password, no matter whether it is generated by the "Add Generated Items..." wizard in Visual Studio or by "Export Bindings..." in the administration console. This is by design, due to security considerations, but it is very annoying, especially when you import bindings which contain multiple WCF send ports. The way to avoid retyping the password every time after an import is to edit the binding file before importing. Here is what needs to be done.

    1. Find the following string:

        &lt;Password vt="1" /&gt;

    "&lt;" means "<", "&gt;" means ">", and "vt" means "Variable Type". Variable type 1 is "NULL", so the above string decodes to "<Password/>".

    2. Replace it with:

        &lt;Password vt="8"&gt;MyPassword&lt;/Password&gt;

    Variable type 8 is "string", so the above decodes to "<Password>MyPassword</Password>".

    Binding files use a lot of character entity references for XML character encoding purposes. For a list of the special character entity references, you can check here.

    ...Edmund Zhao


  • How to add a permanent redirect (301) for an htm file in IIS 7

    - by bconlon
    Looking in Web Analytics I could see several external sites pointing at an old .htm file on my web server that no longer existed, so I thought I would get IIS to redirect to the new .aspx replacement. How hard could it be? This annoyed me for quite a while today, so here is the answer.

    1. Install the HTTP Redirection module - this is not installed by default!!

    Windows 7: Start -> Control Panel -> Programs and Features -> Turn Windows Features on or off, then Internet Information Services -> World Wide Web Services -> Common HTTP Features -> HTTP Redirection.

    Windows Server 2008: Start -> Administrative Tools -> Server Manager, then Roles -> Web Server (IIS) -> Role Services -> Add Role Services, then Common HTTP Features -> HTTP Redirection.

    2. Edit your web.config file:

        <configuration>
            .....
            <location path="oldfile.htm">
                <system.webServer>
                    <httpRedirect enabled="true" destination="/newfile.aspx" exactDestination="true" childOnly="true" httpResponseStatus="Permanent" />
                </system.webServer>
            </location>
            .....
        </configuration>

    When a user clicks or Google crawls 'oldfile.htm', it will get a permanent redirect to '/newfile.aspx' - and should take any Page Rank to the new file.


  • Very heavy .PDF file. How to handle it?

    - by Luigi
    I need to print this file on paper, but it is heavy (2.3 MB) and the printing process is very slow. And this is not the whole problem: I first need to create another .pdf file (grayscale) with four pages on each sheet. When I try to print this file to a .pdf file, the process of creating the new file is even slower than the printing process, and the output file is much heavier than the original (hundreds of megabytes). How can I shrink the PDF file? Is there a way to create a printer-friendly .pdf version of the file? Before you ask, I can't simply print it on pages; I must create this .pdf file with four pages on each sheet.
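
    One commonly suggested route (assuming Ghostscript and the pdfjam tools are available; file names are placeholders):

        # recompress and downsample to shrink the file, converting to grayscale
        gs -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook \
           -sColorConversionStrategy=Gray -dProcessColorModel=/DeviceGray \
           -dNOPAUSE -dBATCH -sOutputFile=small-gray.pdf input.pdf

        # then impose four pages per sheet
        pdfnup --nup 2x2 small-gray.pdf --outfile print-ready.pdf

    The /ebook preset downsamples images to 150 dpi, which is usually the big win on scanned documents; /printer (300 dpi) is the safer choice if print quality matters.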


  • Software Center empty "No usefulness from server" "no username in config file"

    - by Theron G. Burrough
    I upgraded to 12.04 LTS and ran Software Center while a few other things were running, and it crashed (2 GB RAM on a netbook). On reboot, the Ubuntu One interface would not find any programs, neither installed (to run) nor available for download. So I launched Software Center, which opens but does nothing. I click the "Close" button and get a "Force Quit?" box, so I quit. I did some research and learned to run Software Center from a terminal, which gave:

        2012-04-29 23:14:36,978 - softwarecenter.backend.reviews - WARNING - Could not get usefulness from server, no username in config file

    Then I tried the following, without success. Reinstallation of Software Center:

        sudo apt-get install --reinstall software-center

    Didn't work. Found this in a post: remove the config file for software-center, then log out and back in:

        sudo rm -rf ~/.config/software-center

    Didn't work. Reinstalled Software Center with Synaptic Package Manager. Still no dice! And I am a Linux newbie, so I don't know where the Dickens that config file is. Help appreciated.

