Search Results

Search found 96184 results on 3848 pages for 'recent file list'.


  • Processing a list of atomic operations, allowing for interruptions

    - by JDB
    I'm looking for a design pattern that addresses the following situation: There exists a list of tasks that must be processed. Tasks may be added at any time. Each task is wholly independent from all other tasks. The order in which tasks are processed has no effect on the overall system or on the tasks themselves. Every task must be processed once and only once. The "main" process which launches the task processors may start and stop without warning. When stopped, the "main" process loses all in-memory data. Obviously this is going to involve some state, but are there any design patterns which discuss where and how to maintain that state? Are there any relevant anti-patterns? Named patterns are especially helpful so that we can discuss this topic with other organizations without having to describe the entire problem domain.
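
    One family of answers to this kind of question is a persistent work queue with idempotent processing (at-least-once delivery plus an idempotent consumer). The sketch below is a minimal, hypothetical illustration in Java, not taken from the question or its answers: each task is a file, an atomic rename is the claim step, and deletion marks completion, so the only state that must survive a crash is "which files still exist".

        import java.io.IOException;
        import java.nio.file.DirectoryStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.nio.file.StandardCopyOption;

        /** Crash-tolerant queue: tasks live in pending/, are atomically claimed into claimed/, and are deleted when done. */
        public class FileWorkQueue {
            private final Path pending = Paths.get("queue/pending");
            private final Path claimed = Paths.get("queue/claimed");

            public void drain() throws IOException {
                try (DirectoryStream<Path> tasks = Files.newDirectoryStream(pending)) {
                    for (Path task : tasks) {
                        Path mine = claimed.resolve(task.getFileName());
                        try {
                            // Atomic claim: only one worker can win this rename.
                            Files.move(task, mine, StandardCopyOption.ATOMIC_MOVE);
                        } catch (IOException alreadyClaimedOrGone) {
                            continue;
                        }
                        process(mine);        // the actual work; must itself be safe to retry
                        Files.delete(mine);   // completion marker: the task can never run again
                    }
                }
            }

            private void process(Path task) {
                // Task-specific work goes here.
            }
        }

    Anything still sitting in claimed/ after a restart was interrupted mid-processing and can be retried, which is why process() has to be idempotent. The names usually attached to this are "work queue" or "job queue" with at-least-once semantics and idempotent consumers; the matching anti-pattern is keeping the only record of pending work in memory.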

    Read the article

  • Best container to store this information

    - by user2368481
    I'm trying to write a smallish system as a homework exercise. I don't have much experience with containers and I'm not sure what the best way of storing this data would be: an Incident Record object holds instances of Incident Report. Report is a superclass which has 3 subclasses: Police, Fire and Medical. Record must record which of these types apply, and which response teams are to be involved. So Record has to keep track of the Report objects, the type of each report (Police, Fire or Medical) and the teams involved in the reports. I was initially thinking of an array, but that wouldn't be sufficient to hold all the info. Record<>---------Report<|----------Police, Fire or Medical
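
    One common shape for this, sketched below in Java (the class and field names are illustrative, not taken from the question), is to let the Report superclass state its own type and let the Record own a list of reports, the set of types involved, and the response teams:

        import java.util.ArrayList;
        import java.util.EnumSet;
        import java.util.List;
        import java.util.Set;

        enum ReportType { POLICE, FIRE, MEDICAL }

        abstract class Report {
            // Each subclass states which kind of report it is.
            abstract ReportType type();
        }

        class PoliceReport  extends Report { ReportType type() { return ReportType.POLICE; } }
        class FireReport    extends Report { ReportType type() { return ReportType.FIRE; } }
        class MedicalReport extends Report { ReportType type() { return ReportType.MEDICAL; } }

        class IncidentRecord {
            private final List<Report> reports = new ArrayList<>();
            private final Set<ReportType> typesInvolved = EnumSet.noneOf(ReportType.class);
            private final List<String> responseTeams = new ArrayList<>();

            void addReport(Report r) {
                reports.add(r);
                typesInvolved.add(r.type());   // "which types apply" falls out automatically
            }

            void addTeam(String teamName) {
                responseTeams.add(teamName);
            }
        }

    A plain List (rather than an array) is usually enough here; the EnumSet only exists so "which types apply" can be answered without walking the whole list.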

    Read the article

  • List of Application indicators

    - by user8592
    I installed Ubuntu 11.04 on one of my systems and I am using the Unity interface. Unity is working quite nicely so far, but I really miss panel applets for net speed, CPU temp, and system monitoring. These applets show useful quick info. Unlike 10.10, there is no other way to get this information onto the panel or the Unity launcher. There are solutions like screenlets and Conky, but they don't feel appropriate for a clean desktop look. If you know of any third-party indicators, please list them with links so that they can all be found in one place.

    Read the article

  • Ubuntu is not showing in LightDM menu list

    - by dearvivekkumar
    I am using Ubuntu 11.10. Everything was working perfectly this morning. At one point the Ubuntu Software Center application failed to start, so I restarted my system, and when I logged in it started under GNOME. When I then tried to start an Ubuntu session, it was not showing in the menu list, though Ubuntu 2D was. I don't know why this is happening. Please let me know of any fix for this issue. Thanks!

    Read the article

  • Algorithm and structure pseudocode [closed]

    - by user72759
    1. Create a record type that administers a test of 20 questions, with 5 possible responses and one fixed correct answer for each question. Write this test so that each run of the algorithm changes the order of the questions and answers, and the user is notified when the answer they enter is the correct one. 2. Write an example where a stack is used. Create a stack of 20 items where the first element is 1 and each next element is calculated by the formula a_n = a_(n-1) + 1/a_(n-1). Find the items in this stack that are whole numbers.

    Read the article

  • How to grep (or find) on cPanel?

    - by San
    How can I search for a specific string (a function name or a variable name) in my files, which are in various directories under the cPanel file manager? I have a library directory whose functions are used in various apps and pages. Now I need to change something in the library file, and for that I need to know the impact on the files which use this library's functions. How do I search / find / grep through the hosted files?

    Read the article

  • How to back up/restore full-disk encryption on Ubuntu 11.10?

    - by ggc
    How do I back up and restore full-disk encryption on Ubuntu 11.10? I would like to take the raw encrypted file system and restore it on another computer. Encryption details: crypt setup via the Ubuntu alternate CD installer; the only thing unencrypted is /boot. File system setup: boot: j, swap: swap, everything else: ext4. Any suggestions? I have considered backing up the file system stripped of encryption, but I would prefer to keep the OS encrypted while transferring. Thanks for any help!

    Read the article

  • How can I improve the performance of this double-for print?

    - by Florenc
    I have the following static method that prints the data imported from a 40,000-line .xls spreadsheet. Right now it takes about 27 seconds to print the data to the console and the memory consumption is huge. import org.apache.poi.hssf.usermodel.*; import org.apache.poi.ss.usermodel.*; public static void printSheetData(List<List<HSSFCell>> sheetData) { for (int i = 0; i < sheetData.size(); i++) { List<HSSFCell> list = (List<HSSFCell>) sheetData.get(i); for (int j = 0; j < list.size(); j++) { HSSFCell cell = (HSSFCell) list.get(j); System.out.print(cell.toString()); if (j < list.size() - 1) { System.out.print(", "); } } System.out.println(""); } } Disclaimer: I know, I know, large data belongs in a database, don't print output to the console, premature optimization is the root of all evil...
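
    Most of the time in code like this typically goes to the many small System.out.print calls rather than to the list traversal itself. A minimal sketch of one way to cut that down (an illustration, not the accepted answer) is to build each row in a reused StringBuilder and write it once per row:

        import java.util.List;

        import org.apache.poi.hssf.usermodel.HSSFCell;

        public static void printSheetData(List<List<HSSFCell>> sheetData) {
            StringBuilder row = new StringBuilder();
            for (List<HSSFCell> cells : sheetData) {
                row.setLength(0);                       // reuse the buffer for every row
                for (int j = 0; j < cells.size(); j++) {
                    if (j > 0) {
                        row.append(", ");
                    }
                    row.append(cells.get(j).toString());
                }
                System.out.println(row);                // one write per row instead of one per cell
            }
        }

    Wrapping System.out in a BufferedWriter or a PrintWriter with a large buffer and flushing once at the end usually helps further, since the console itself is often the real bottleneck.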

    Read the article

  • File copying utility like rsync with error handling like ddrescue, for data recovery from a hard drive with bad sectors or hardware failure

    - by purefusion
    I have a hard drive with either bad blocks or sectors that are failing to read due to potential mechanical issues, such as a bad disk head, bad motor, or some other issue that is causing the hard drive to read data excruciatingly slowly and with lots of read errors. I'm seeing an average of 50 KB/sec, with some reads dropping below 10 KB/sec, and frequently it gets stuck on a file or sector altogether, usually for quite a long time—from 2-10 minutes or more (when using rsync, before it times out). Speed seems to vary wildly, and it gets stuck on files a lot, and when it finally gets "unstuck" it only seems to last for a short burst before it gets stuck again. The drive is also very quiet with only an occasional sound of files copying (usually when it gets stuck/unstuck for a brief time, before getting stuck again). Thus, there are none of those evil sounds that are normally associated with HDD death. Someone suggested that the problems sounded like they might be caused by a misaligned disk head, which requires a lot of re-reads before it finally reads data with success. Sounds plausible, but I digress... Anyway, the problem with rsync is that it seems to have no decent error handling support. Obviously, it wasn't meant for use in recovering data from failing hard drives, but all the so-called "data recovery" utilities out there that are meant for such use usually focus on recovery of deleted files or messed up partitions, rather than copying files off dying hard drives. Deleted file recovery is not what I need, obviously, so perhaps you can understand my disappointment in not being able to find what I'm after yet. Naturally, this is where you'd probably say "You should use ddrescue!" Well, that's all fine and dandy, but I've already got most of the data backed up, so I just want to recover certain files. I'm not concerned with trying to recover a full partition block-by-block as ddrescue does. I am only interested in rescuing just specific files and directories. Ideally, what I'd like is some sort of cross between rsync and ddrescue: something that lets me specify source and destination as directories of normal files like rsync (rather than two full partitions as ddrescue requires), with a way to skip files with errors in an initial run, and then allows me to attempt recovery of those files with errors in a later run (with a slightly altered command, of course), perhaps even offering an option to specify the number of retry attempts ...just like how ddrescue works with blocks, only I want a utility that works with specific files/directories like rsync does. So am I daydreaming here, or does something out there exist that can do this? Or, maybe even a way to make rsync or ddrescue work in such a way? I'm really open to whatever solutions might work, so long as they let me choose which files I want to "rescue", and can skip files with errors in the initial run, and try/retry those errors again later. So far I've tried rsync with the following options, but it often gets stuck on a file for longer than the timeout, and ideally I'd just like it to move on to the next file and come back later to the files it gets stuck on. I don't think that's possible though. Anyway, here's what I've been using up till now: rsync -avP --stats --block-size=512 --timeout=600 /path/to/source/* /path/to/destination/

    Read the article

  • Linux file names & file globbing

    - by John Lelos
    I have a list of files named: file000 file001 file002 file003 ... file1100. How can I match all files that have a number greater than 800 but less than 1000? I am using Linux bash. Thank you. Edit: Actually, my files are named like: ab869.enc cp936.enc g122345.enc x2022.enc abc8859-14.enc aax5601.enc cp936-1.enc, so the first solution doesn't match the correct files :( How can I match files that have a number between 800 and 999?

    Read the article

  • Problem When Compressing File using vb.net

    - by Amr Elnashar
    I have a file like "Sample.bak", and when I compress it to "Sample.zip" I lose the file extension inside the zip file. I mean, when I open the compressed file I find "Sample" without any extension. I use this code: Dim name As String = Path.GetFileName(filePath).Replace(".Bak", "") Dim source() As Byte = System.IO.File.ReadAllBytes(filePath) Dim compressed() As Byte = ConvertToByteArray(source) System.IO.File.WriteAllBytes(destination & name & ".Bak" & ".zip", compressed) Or this code: Public Sub cmdCompressFile(ByVal FileName As String) 'Stream object that reads file contents Dim streamObj As Stream = New StreamReader(FileName).BaseStream 'Allocate space in buffer according to the length of the file read Dim buffer(streamObj.Length) As Byte 'Fill buffer streamObj.Read(buffer, 0, buffer.Length) streamObj.Close() 'File Stream object used to change the extension of a file Dim compFile As System.IO.FileStream = File.Create(Path.ChangeExtension(FileName, "zip")) 'GZip object that compress the file Dim zipStreamObj As New GZipStream(compFile, CompressionMode.Compress) 'Write to the Stream object from the buffer zipStreamObj.Write(buffer, 0, buffer.Length) zipStreamObj.Close() End Sub Please, I need to compress the file without losing the file extension inside the compressed file. Thanks.

    Read the article

  • FTP Server upload and filesystem questions

    - by Alex
    I'm a photographer who mainly does event photography. A while ago I bought myself a Nikon WT-4 wireless transmitter, a small device which connects via USB to my Nikon D700 DSLR and then establishes a WiFi connection to an existing WLAN. It can then upload any pictures I take via FTP to an FTP server somewhere in the network. On my laptop I have a piece of software which checks a given folder on the disk regularly; this software is smart enough to look at the modified-file timestamp, and if the timestamp is less than 10 seconds old it will skip the file in this iteration of the import scan rather than import it. The problem I've discovered seems to be inherent to the FTP protocol, as I have the same problem with Windows 7's built-in IIS server as I do with FileZilla FTP server. When the transmitter starts to upload a file, the FTP server will create a small 300-500 KB file with the correct filename on the disk, but then do nothing with the file until it has completely received it via FTP. So it seems to create this small dummy file, buffer the remainder of the FTP upload until it's finished, and then dump the rest of the file into the dummy file, making it the correct size. Problem is, these uploads take about 15-30 seconds depending on reception, but since the folder-watch tool will already try to import any file older than 10 seconds, it always tries to import the small dummy files, which obviously fails as they're not complete yet. Is there any way to 'disable' this behaviour? Ideally I would like my file to show up only once it's been completely uploaded. Or perhaps someone knows another FTP server application (it has to run on Win7) which does not show this behaviour?

    Read the article

  • .NET converting simple arrays to List Generics

    - by Manish Sinha
    This question might seem trivial and even stupid at first glance, but it is more than that. I have an array of some type T (T[]) and I want to convert it into a generic List (List<T>). Is there any way other than creating a generic list, traversing the whole array and adding each element to the List? Present Situation: string[] strList = {'foo','bar','meh'}; List<string> listOfStr = new List<string>(); foreach(string s in strList) { listOfStr.Add(s); } My ideal situation: string[] strList = {'foo','bar','meh'}; List<string> listOfStr = strList.ToList<string>(); Or: string[] strList = {'foo','bar','meh'}; List<string> listOfStr = new List<string>(strList); I am suggesting the last 2 forms as I think the compiler or CLR could perform some optimizations on the whole operation if it were built in. P.S.: I am not talking about the Array or ArrayList type.

    Read the article

  • maintaining a large list in python

    - by Oren
    I need to maintain a large list of Python pickleable objects that will quickly execute the following algorithm: The list will have 4 pointers and can: Read\Write each of the 4 pointed nodes Insert new nodes before the pointed nodes Increase a pointer by 1 Pop nodes from the beginning of the list (without overlapping the pointers) Add nodes at the end of the list The list can be very large (300-400 MB), so it shouldn't be contained in RAM. The nodes are small (100-200 bytes) but can contain various types of data. The most efficient way to do this, in my opinion, is to implement some paging mechanism. On the hard drive there would be a database of a linked list of pages, of which at most 5 are loaded in memory at any moment. On insertion, if some max page size is reached, the page is split into 2 smaller pages; on pointer increment, if a pointer moves beyond its loaded page, the following page is loaded instead. This is a good solution, but I don't have the time to write it, especially if I want to make it generic and implement all the Python list features. Using SQL tables is not good either, since I'd need to implement a complex index-key mechanism. I tried ZODB and zc.blist, which implement a BTree-based list that can be stored in a ZODB database file, but I don't know how to configure them so the above features run in reasonable time. I don't need all the multi-threading\transactioning features. No one else will touch the database file except for my single-threaded program. Can anyone explain how to configure ZODB\zc.blist so the above features will run fast, or show me a different large-list implementation?
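
    The paging mechanism described above is independent of ZODB; at its core it is a bounded cache of pages keyed by page index, with write-back on eviction. A rough sketch of that core (shown in Java purely for illustration; the names and the page representation are made up, and the disk I/O is stubbed out):

        import java.util.ArrayList;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        /** Keeps at most MAX_RESIDENT pages in memory; the eldest page is written back on eviction. */
        class PageCache {
            static final int MAX_RESIDENT = 5;

            private final Map<Integer, List<byte[]>> resident =
                new LinkedHashMap<Integer, List<byte[]>>(16, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<Integer, List<byte[]>> eldest) {
                        if (size() > MAX_RESIDENT) {
                            writeBack(eldest.getKey(), eldest.getValue());  // persist before dropping
                            return true;
                        }
                        return false;
                    }
                };

            /** Returns the requested page, loading it from disk if it is not resident. */
            List<byte[]> page(int index) {
                List<byte[]> p = resident.get(index);
                if (p == null) {
                    p = loadFromDisk(index);
                    resident.put(index, p);   // may evict (and write back) the least recently used page
                }
                return p;
            }

            private List<byte[]> loadFromDisk(int index) { return new ArrayList<>(); }  // stub
            private void writeBack(int index, List<byte[]> page) { /* stub: persist the page */ }
        }

    Splitting a page when it exceeds a maximum size, and moving a pointer to the next page when it walks off the end of its current one, both sit on top of this cache without changing it.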

    Read the article

  • Open an Excel file using COM and save it as an .xml file

    - by chupinette
    Hi. I'm trying the following code: <?php $workbook = "D:\b2\\test.XLS"; $sheet = "Sheet1"; #Instantiate the spreadsheet component. $ex = new COM("Excel.sheet") or Die ("Did not connect"); #Get the application name and version print "Application name:{$ex->Application->value}<BR>" ; print "Loaded version: {$ex->Application->version}<BR>"; #Open the workbook that we want to use. $wkb = $ex->application->Workbooks->Open($workbook) or Die ("Did not open"); #Create a copy of the workbook, so the original workbook will be preserved. $ex->Application->ActiveWorkbook->SaveAs("D:\b2\Ourtest.xml"); #$ex->Application->Visible = 1; #Uncomment to make Excel visible. #Optionally, save the modified workbook $ex->Application->ActiveWorkbook->SaveAs("D:\Ourtest.xml"); #Close all workbooks without questioning $ex->application->ActiveWorkbook->Close("False"); unset ($ex); ?> This actually works and creates the Ourtest.xml file, but I'm getting characters like: ÐÏࡱá þÿ þÿÿÿ. I have tried SaveAs("D:\Ourtest.pdf") and it says the file has been corrupted or incorrectly decoded. Can anyone help me, please? Thanks

    Read the article

  • Replace delimited block of text in file with the contents of another file

    - by rmarimon
    I need to write a simple script to replace a block of text in a configuration file with the contents of another file. Let's assume we have the following simplified files: server.xml <?xml version='1.0' encoding='UTF-8'?> <Server port="8005" shutdown="SHUTDOWN"> <Service name="Catalina"> <Connector port="80" protocol="HTTP/1.1"/> <Engine name="Catalina" defaultHost="localhost"> <!-- BEGIN realm --> <sometags/> <sometags/> <!-- END realm --> <Host name="localhost" appBase="webapps"/> </Engine> </Service> </Server> realm.xml <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/> I want to run a script and have realm.xml replace the contents between the <!-- BEGIN realm --> and <!-- END realm --> lines. If realm.xml changes, then whenever the script is run again it will replace the lines again with the new contents of realm.xml. This is intended to be run in /etc/init.d/tomcat on startup of the service, on multiple installations on which the realm is going to be different. I'm not so sure how I can do this simply with awk or sed.

    Read the article

  • PHP: What is an efficient way to parse a text file containing very long lines?

    - by Shaun
    I'm working on a parser in php which is designed to extract MySQL records out of a text file. A particular line might begin with a string corresponding to which table the records (rows) need to be inserted into, followed by the records themselves. The records are delimited by a backslash and the fields (columns) are separated by commas. For the sake of simplicity, let's assume that we have a table representing people in our database, with fields being First Name, Last Name, and Occupation. Thus, one line of the file might be as follows [People] = "\Han,Solo,Smuggler\Luke,Skywalker,Jedi..." Where the ellipses (...) could be additional people. One straightforward approach might be to use fgets() to extract a line from the file, and use preg_match() to extract the table name, records, and fields from that line. However, let's suppose that we have an awful lot of Star Wars characters to track. So many, in fact, that this line ends up being 200,000+ characters/bytes long. In such a case, taking the above approach to extract the database information seems a bit inefficient. You have to first read hundreds of thousands of characters into memory, then read back over those same characters to find regex matches. Is there a way, similar to the Java String next(String pattern) method of the Scanner class constructed using a file, that allows you to match patterns in-line while scanning through the file? The idea is that you don't have to scan through the same text twice (to read it from the file into a string, and then to match patterns) or store the text redundantly in memory (in both the file line string and the matched patterns). Would this even yield a significant increase in performance? It's hard to tell exactly what PHP or Java are doing behind the scenes.
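
    For reference, the Java mechanism the question alludes to looks roughly like the sketch below: a Scanner over the file with the backslash as its delimiter hands back one record at a time, buffering internally instead of loading the whole line. The file name and the record layout are the hypothetical ones from the question, and the snippet ignores the leading "[People] = " prefix for simplicity.

        import java.io.File;
        import java.io.FileNotFoundException;
        import java.util.Scanner;

        public class RecordScanner {
            public static void main(String[] args) throws FileNotFoundException {
                // Records are separated by backslashes, fields by commas,
                // e.g. \Han,Solo,Smuggler\Luke,Skywalker,Jedi
                Scanner in = new Scanner(new File("people.txt"));
                in.useDelimiter("\\\\");                      // regex matching a single backslash
                while (in.hasNext()) {
                    String[] fields = in.next().split(",");   // one record at a time
                    if (fields.length == 3) {
                        System.out.println(fields[1] + ", " + fields[0] + ": " + fields[2]);
                    }
                }
                in.close();
            }
        }

    On the PHP side, stream_get_line($handle, $length, "\\") reads up to a delimiter without slurping the entire line first, which is probably the closest analogue; whether it actually beats fgets() plus preg_match() here would need measuring.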

    Read the article

  • How to create a generic list in this weird case in C#

    - by Marc Bettex
    Hello, In my program I have a class A which is extended by B, C and many more classes. I have a method GetInstance() which returns an instance of B or C (or of one of the other children), but I don't know which one, so the return type of the method is A. In the method CreateGenericList(), I have a variable v of type A, which is in fact either a B, a C or another child type, and I want to create a generic list of the proper type, i.e. List<B> if v is a B or List<C> if v is a C, ... Currently I do it by using reflection, which works, but this is extremely slow. I wanted to know if there is another way to do it which doesn't use reflection. Here is an example of the code of my problem: class A { } class B : A { } class C : A { } // More childs of A. class Program { static A GetInstance() { // returns an instance of B or C } static void CreateGenericList() { A v = Program.GetInstance(); IList genericList = // Here I want an instance of List<B> or List<C> or ... depending of the real type of v, not a List<A>. } } I tried the following hack. I call the following method, hoping the type inferencer will guess the type of model, but it doesn't work and returns a List<A>. I believe that is because C# is statically typed, so T is resolved as A and not as the real type of model at runtime. static List<T> CreateGenericListFromModel<T>(T model) where T : A { return new List<T> (); } Does anybody have a solution to this problem that doesn't use reflection, or is it impossible to solve without reflection? Thank you very much, Marc

    Read the article

  • Transferring a set with a Wildcarded Generic to a List in Java

    - by Daniel Bingham
    I have a data type that contains a set and a method that expects List<? extends MyClass>. The data type has Set<? extends MyClass>. I need to be able to move the stuff out of the set and into the List. The order it goes into the list doesn't matter, it just needs to start keeping track of it so that it can be reordered when displayed. Suffice to say that changing the Set into a List in the data type is out of the question here. This seems pretty easy at first. Create a new method that takes a Set instead of a List, changes it into a list and then passes it on to the old method that just took a list. The problem comes in changing the set to a list. public void setData(Set<? extends MyClass> data) { List<? extends Myclass> newData = ArrayList< /* What goes here? */ >(); for(ConcordaEntityBean o : data) { newData.add(o); } setData(newData); } Obviously, I can't instantiate an ArrayList with a wildcard, it chokes. I don't know the type at that point. Is there some way to pull the type out of data and pass it to ArrayList? Can I just instantiate it with MyClass? Is there some other way to do this?
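
    The usual way out of this is to capture the wildcard in a method type parameter, so the element type has a name that can be handed to the ArrayList constructor. A sketch, assuming the existing method is setData(List<? extends MyClass>) and this overload lives in the same class:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.Set;

        public <T extends MyClass> void setData(Set<T> data) {
            // T is inferred from the caller's Set, so the copy keeps its element type.
            List<T> newData = new ArrayList<T>(data);
            setData(newData);   // delegate to the existing List-based method
        }

    Calling it with a Set<? extends MyClass> works because capture conversion binds the wildcard to T at the call site; no per-element loop or knowledge of the concrete subtype is needed.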

    Read the article

  • Distinct() to return List<> returning Duplicates

    - by KDM
    I have a list of Filters that are passed into a web service. I iterate over the collection, do a LINQ query, and then add to the list of products, but when I do a GroupBy and Distinct() it doesn't remove the duplicates. I am using an IEnumerable because when you use Distinct it converts the result to IEnumerable. If you know how to construct this better and make my function return a List<Product>, that would be appreciated, thanks. Here is my code in C#: if (Tab == "All-Items") { List<Product> temp = new List<Product>(); List<Product> Products2 = new List<Product>(); foreach (Filter filter in Filters) { List<Product> products = (from p in db.Products where p.Discontinued == false && p.DepartmentId == qDepartment.Id join f in db.Filters on p.Id equals f.ProductId join x in db.ProductImages on p.Id equals x.ProductId where x.Dimension == "180X180" && f.Name == filter.Name /*Filter*/ select new Product { Id = p.Id, Title = p.Title, ShortDescription = p.ShortDescription, Brand = p.Brand, Model = p.Model, Image = x.Path, FriendlyUrl = p.FriendlyUrl, SellPrice = p.SellPrice, DiscountPercentage = p.DiscountPercentage, Votes = p.Votes, TotalRating = p.TotalRating }).ToList<Product>(); foreach (Product p in products) { temp.Add(p); } IEnumerable temp2 = temp.GroupBy(x => x.Id).Distinct(); IEnumerator e = temp.GetEnumerator(); while (e.MoveNext()) { Product c = e.Current as Product; Products2.Add(c); } } pf.Products = Products2;// return type must be List<Product> }

    Read the article

  • XP, how to apply security to files? Now have simple file sharing and can't access some files from other machines

    - by Jules
    For a month or two now I've been using simple file sharing; for several months before that I didn't, and before that I had simple file sharing turned on. So at the moment I don't have a security tab (on files or folders) or sharing permissions settings either. As an example, from another machine I can access files from 2007 but not files from the summer of last year in the same folder. I can access all files on the local machine itself. So I think I just need to re-apply security or permissions somehow? What should I do?

    Read the article

  • How can I edit local security policy from a batch file?

    - by Stephen Jennings
    I am trying to write a utility as a batch file that, among other things, adds a user to the "Deny logon locally" local security policy. This batch file will be used on hundreds of independent computers (not on a domain and aren't even on the same network). I assumed one of the following were my options, but perhaps there's one I haven't thought of. A command line utility similar to net.exe which can modify local security policy. A VBScript sample to do the same. Write my own using some WMI or Win32 calls. I'd rather not do this one if I don't have to.

    Read the article

  • Increasing file descriptor limit on Debian does not work! Help!

    - by Aco
    I am running Debian 6 and I am trying to increase the file descriptor limit but it does not want to work. This is what I have done: I edited /etc/sysctl.conf by adding fs.file-max = 64000 at the end and applied the changes using sysctl -p. I then edited /etc/security/limits.conf and added the following lines: * soft nofile 64000 and * hard nofile 64000. Now when I execute ulimit -Hn and ulimit -Sn I still see 1024. I rebooted the server and I still get the same result. What have I failed to do?

    Read the article

  • Plesk file permissions - Apache/PHP conflicting with user accounts.

    - by hfidgen
    Hiya, I'm building a Drupal site which performs various automatic disk operations using the apache user (id=40). The problem is that the site was set up on a subdomain belonging to user ID 10001 (i.e. my main FTP account), so the filesystem belongs to that user ID. So I keep getting errors like this: warning: move_uploaded_file() [function.move-uploaded-file]: SAFE MODE Restriction in effect. The script whose uid is 10001 is not allowed to access /var/www/vhosts/domain.com/httpdocs/sites/default/files/images/user owned by uid 48 in /var/www/vhosts/domain.com/httpdocs/includes/file.inc on line 579. I've tried changing the apache group in httpd.conf to apache:psacln, psacln being the default group for all web users, but that hasn't helped. The situation now is: ..../files/images/ = 777 and chown = ftplogin:psacln ..../files/images/user = 775 and chown = apache:psacln ..../files/tmp = 777 and chown = ftplogin:psacln So apparently uid 40 and 10001 both have permissions to write to any of the 3 directories involved, but still can't. Am I missing something here? Can anyone help? Thanks!

    Read the article

  • Puppet Directory and File ownership ignored

    - by Phil Sturgeon
    Puppet seems to be lying to me, which is not very nice. I am trying to set some files and directories included in /vagrant/src to be 666 and 777, and set the ownership group to the correct Apache user (using the PuppetLabs Apache module). Output from Puppet says yes. [default] Running provisioner: Vagrant::Provisioners::Puppet... [default] Running Puppet with /tmp/vagrant-puppet/manifests/default.pp... stdin: is not a tty No LSB modules are available. warning: require is a metaparam; this value will inherit to all contained resources warning: notify is a metaparam; this value will inherit to all contained resources notice: /Stage[main]//File[/vagrant/src/addons/]/owner: owner changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/addons/]/group: group changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/addons/]/mode: mode changed '0755' to '0777' notice: /Stage[main]//Package[curl]/ensure: ensure changed 'purged' to 'present' notice: /Stage[main]//File[/vagrant/src/system/cms/config/]/owner: owner changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/system/cms/config/]/group: group changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/system/cms/config/]/mode: mode changed '0755' to '0777' notice: /Stage[main]//File[/vagrant/src/system/cms/config/config.php]/owner: owner changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/system/cms/config/config.php]/group: group changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/system/cms/cache/]/owner: owner changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/system/cms/cache/]/group: group changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/system/cms/cache/]/mode: mode changed '0755' to '0777' notice: /Stage[main]//File[/vagrant/src/uploads/]/owner: owner changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/uploads/]/group: group changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/uploads/]/mode: mode changed '0755' to '0777' notice: /Stage[main]/Apache/Service[httpd]/ensure: ensure changed 'stopped' to 'running' notice: /Stage[main]//File[/vagrant/src/assets/cache/]/owner: owner changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/assets/cache/]/group: group changed 'vagrant' to 'www-data' notice: /Stage[main]//File[/vagrant/src/assets/cache/]/mode: mode changed '0755' to '0777' notice: Finished catalog run in 2.29 seconds Output from ls -lah says no: $ ls -lah /vagrant/src/ total 36K drwxr-xr-x 1 vagrant vagrant 510 2012-07-03 00:11 . drwxr-xr-x 1 vagrant vagrant 340 2012-07-03 08:08 .. drwxr-xr-x 1 vagrant vagrant 136 2012-07-03 00:11 addons drwxr-xr-x 1 vagrant vagrant 102 2012-07-03 00:11 assets drwxr-xr-x 1 vagrant vagrant 510 2012-07-03 07:45 .git -rw-r--r-- 1 vagrant vagrant 1.3K 2012-07-03 00:11 .gitignore -rwxr-xr-x 1 vagrant vagrant 1.4K 2012-07-03 00:11 .htaccess -rwxr-xr-x 1 vagrant vagrant 8.8K 2012-07-03 00:11 index.php drwxr-xr-x 1 vagrant vagrant 442 2012-07-03 00:11 installer -rwxr-xr-x 1 vagrant vagrant 2.8K 2012-07-03 00:11 LICENSE -rw-r--r-- 1 vagrant vagrant 1.1K 2012-07-03 00:11 phpdoc.dist.xml -rw-r--r-- 1 vagrant vagrant 3.3K 2012-07-03 00:11 README.md drwxr-xr-x 1 vagrant vagrant 204 2012-07-03 00:11 system -rw-r--r-- 1 vagrant vagrant 42 2012-07-03 00:11 .travis.yml drwxr-xr-x 1 vagrant vagrant 102 2012-07-03 00:11 uploads Whats up with that? My entire config can be found here.

    Read the article
