Search Results

Search found 49518 results on 1981 pages for 'configuration files'.

  • Merge Multiple Worksheets From Multiple Workbooks

    - by Droter
    Hi, I have found multiple posts on merging data but I am still running into some problems. I have multiple files with multiple sheets, for example 2007-01.xls ... 2007-12.xls, and in each of these files is daily data on sheets labeled 01, 02, 03, and so on. There are other sheets in the files, so I can't just loop through all worksheets. I need to combine the daily data into monthly data, then all of the monthly data points into yearly data. The monthly data needs to be added to the bottom of the page. I have added the file-open changes for Excel 2007. Here is what I have so far:

        Sub RunCodeOnAllXLSFiles()
            Dim lCount As Long
            Dim wbResults As Workbook
            Dim wbMaster As Workbook
            Application.ScreenUpdating = False
            Application.DisplayAlerts = False
            Application.EnableEvents = False
            On Error Resume Next
            Set wbMaster = ThisWorkbook
            Dim oWbk As Workbook
            Dim sFil As String
            Dim sPath As String
            sPath = "C:\Users\test\" 'location of files
            ChDir sPath
            sFil = Dir("*.xls") 'change or add formats
            Do While sFil <> "" 'will start LOOP until all files in folder sPath have been looped through
                Set oWbk = Workbooks.Open(sPath & "\" & sFil) 'opens the file
                Sheets("01").Select ' HARD CODED FIRST DAY
                Range("B6:F101").Select 'AREA I NEED TO COPY
                Range("B6:F101").Copy
                wbMaster.Activate
                Workbooks("wbMaster").ActiveSheet.Range("B65536").End(xlUp)(2).PasteSpecial Paste:=xlValues
                Application.CutCopyMode = False
                oWbk.Close True 'close the workbook, saving changes
                sFil = Dir
            Loop ' End of LOOP
            On Error Goto 0
            Application.ScreenUpdating = True
            Application.DisplayAlerts = True
            Application.EnableEvents = True
        End Sub

    Right now it can find the files, open them, and get to the right worksheet, but when it tries to copy the data nothing is copied over. Thanks for your help, Matt
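
    A minimal sketch of how the paste step could look if it targets the wbMaster object directly instead of looking the workbook up by the name "wbMaster" (with On Error Resume Next active, a failed lookup by name would be swallowed silently, which would match the "nothing is copied" symptom). Worksheets(1) is an assumption standing in for whichever master sheet should receive the data:

        ' inside the Do While loop, after oWbk has been opened
        oWbk.Sheets("01").Range("B6:F101").Copy
        With wbMaster.Worksheets(1)
            ' paste below the last used row in column B
            .Range("B" & .Rows.Count).End(xlUp).Offset(1, 0).PasteSpecial Paste:=xlPasteValues
        End With
        Application.CutCopyMode = False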

  • Recursive multiline sed - remove beginning of file until pattern match.

    - by yaya3
    I have nested subdirectories containing HTML files. For each of these HTML files I want to delete from the top of the file until the pattern <div id="left-. This is my attempt from OS X's terminal:

        find . -name "*.html" -exec sed "s/.*?<div id=\"left-col/<div id=\"left-col/g" '{}' \;

    I get a lot of HTML output in the terminal, but no files contain the substitution and none are written. Thanks
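
    For what it's worth, a sketch of one approach under two assumptions: sed's regex flavor has no non-greedy .*?, and without an in-place flag it only writes to stdout, which would explain the HTML flooding the terminal. This uses GNU sed's 0,/pattern/ range address (gsed on OS X, for example from MacPorts or Homebrew, is an assumption, as is the exact marker text):

        # delete every line before the first line containing the marker, editing files in place
        find . -name "*.html" -exec gsed -i '0,/<div id="left-col/{/<div id="left-col/!d}' '{}' \;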

  • How do you update the server name in source indexed symbol file?

    - by Keith Hill
    With the Debugging Tools for Windows you can run SSIndex.cmd against your symbol files and it will embed the commands to retrieve each source code file from the TFS server. We have a bunch of indexed files, and recently our IT migrated our TFS 2008 installation to TFS 2010 and in the process changed the server name. The question is, how can I update all these symbol files to point to the new server? I thought SSIndex used an alternate data stream named 'srcsrv', but SysInternals' streams.exe shows nothing on these symbol files even though srctool.exe shows the data.
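
    A sketch of how the source server stream is usually edited after the fact, using pdbstr.exe from the same Debugging Tools package (the indexing data lives in a stream inside the PDB itself rather than in an NTFS alternate data stream, which would explain why streams.exe sees nothing; file names below are placeholders, and the exact switch behaviour is worth checking against pdbstr's own usage text):

        rem dump the existing source server stream out of the PDB
        pdbstr -r -p:MyApp.pdb -s:srcsrv > srcsrv.txt

        rem edit srcsrv.txt, replacing the old TFS server name with the new one, then write it back
        pdbstr -w -p:MyApp.pdb -s:srcsrv -i:srcsrv.txt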

  • How do I relocate assemblies from a deployment project without breaking application references?

    - by James
    Hi, I have recently refactored a lot of my application's existing code and I am now looking at tidying up the deployment side of things. The existing installer application installs everything in the application folder (with the exclusion of a couple of config files which are located in a subfolder). However, I have multiple applications which all use some common assemblies, and my goal is to relocate these particular assemblies to the "Common Files" folder in the Program Files directory. NB: I have read a lot about the GAC but I have no experience with it and have also read a few horror stories, so I am trying to get a simple solution for the time being. I managed to get the assemblies installed into the Common Files folder; however, as a result (typical I.T.) I have broken my app! If I copy the assemblies back into the application folder it works fine, so the problem is obviously to do with how my app is referencing the assemblies. To get the installer to install the assemblies into the Common Files folder I just updated the Folder property of each assembly in the Detected Dependencies list. My thought was that when I did that, the installer would somehow update my application to tell it to look in that folder for them, but that doesn't appear to be the case. What exactly am I doing wrong here?
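
    For background, the CLR only probes the application base folder (plus any privatePath subfolders beneath it) when resolving references, so assemblies moved to Common Files won't be found unless the application is told where to look. A hedged sketch of one option, a codeBase hint in the app.config (assembly name, public key token, version and path are placeholders; a codeBase outside the application folder requires the assembly to be strong-named):

        <configuration>
          <runtime>
            <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
              <dependentAssembly>
                <assemblyIdentity name="MyCompany.Common" publicKeyToken="0123456789abcdef" culture="neutral" />
                <codeBase version="1.0.0.0" href="file:///C:/Program Files/Common Files/MyCompany/MyCompany.Common.dll" />
              </dependentAssembly>
            </assemblyBinding>
          </runtime>
        </configuration>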

  • Net::SMTPFatalError in rails 2.3.4

    - by Brian Roisentul
    I'm getting the following error when trying to send email on Rails 2.3.4 (it worked on 2.3.2) using the action_mailer_tls plugin:

        Net::SMTPFatalError in UsersController#create
        555 5.5.2 Syntax error. w3sm66205164ybi.9

        C:/Program Files (x86)/NetBeans 6.8/ruby2/jruby-1.4.0/lib/ruby/1.8/net/smtp.rb:930:in `check_response'
        C:/Program Files (x86)/NetBeans 6.8/ruby2/jruby-1.4.0/lib/ruby/1.8/net/smtp.rb:899:in `getok'
        C:/Program Files (x86)/NetBeans 6.8/ruby2/jruby-1.4.0/lib/ruby/1.8/net/smtp.rb:828:in `mailfrom'
        C:/Program Files (x86)/NetBeans 6.8/ruby2/jruby-1.4.0/lib/ruby/1.8/net/smtp.rb:653:in `send_message'
        C:/Ruby/lib/ruby/gems/1.8/gems/actionmailer-2.3.4/lib/action_mailer/base.rb:683:in `perform_delivery_smtp'
        C:/Program Files (x86)/NetBeans 6.8/ruby2/jruby-1.4.0/lib/ruby/1.8/net/smtp.rb:526:in `start'
        C:/Ruby/lib/ruby/gems/1.8/gems/actionmailer-2.3.4/lib/action_mailer/base.rb:681:in `perform_delivery_smtp'
        C:/Ruby/lib/ruby/gems/1.8/gems/actionmailer-2.3.4/lib/action_mailer/base.rb:523:in `deliver!'
        C:/Ruby/lib/ruby/gems/1.8/gems/actionmailer-2.3.4/lib/action_mailer/base.rb:395:in `method_missing'
        D:/Proyectos/Cursometro/www/app/models/user_observer.rb:3:in `after_create'
        D:/Proyectos/Cursometro/www/app/controllers/users_controller.rb:221:in `create_new_user'
        D:/Proyectos/Cursometro/www/app/controllers/users_controller.rb:101:in `create'

    This has happened after I changed the following line at action_mailer/

  • Excel File opened under Ubuntu, read by R, OpenOffice

    - by Vishal Belsare
    I have a bunch of Excel files which are updated on a daily basis on a Windows machine. I transfer them to an Ubuntu machine and want to open them there. Specifically, I want to read the files as a database under R. A couple of years ago I used ODBC on a Windows machine to open Excel files through R. Is there any way I can do this with R under Ubuntu? I could make a database .ODB file for the corresponding XLS files using OpenOffice, but I don't know of a way to connect to a .ODB database; OpenOffice seems to have ways to connect to databases, but no way to connect to .ODB. Thanks for any potential solutions.
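
    A sketch of one route that avoids ODBC entirely: the gdata package reads .xls files through a bundled Perl parser, so it works on Ubuntu as long as Perl is installed (the path and sheet name here are placeholders):

        # install.packages("gdata")   # one-off
        library(gdata)
        daily <- read.xls("/home/user/data/2007-01.xls", sheet = "01")
        head(daily)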

  • JMeter and multiple HTTP requests

    - by JosiP
    Hello, I need to test my web server, but I need to measure it with lots of files of different sizes. I have about 500-1000 such files. Can you please tell me how to do this in JMeter? Maybe there is an option where I can supply a file containing the list of files to retrieve? Thanks in advance
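
    A sketch of the usual approach, driving the request path from a CSV Data Set Config element (the file name, paths, and variable name below are placeholders):

        # files.csv - one URL path per line
        /static/file0001.bin
        /static/file0002.bin

        # Test plan outline:
        #   Thread Group
        #     CSV Data Set Config    Filename: files.csv    Variable Names: filepath
        #     HTTP Request           Path: ${filepath}

    Each iteration of the Thread Group reads the next line of the CSV, so every sampler requests a different file.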

  • Add a file in a folder with bash

    - by CuriousGeorge
    I have figured out how to remove bad files from my folders, but I want to know how to add certain named files, such as address.xml, to those folders. This is what I have to remove the bad files:

        for addxml in $(find $DIRECTORY/$directoryname -name address.xml); do
            rm -v $addxml
        done

    Later in the script I would like to add a file from another folder that is nowhere near the folders being edited.
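
    A minimal sketch of the copy step, assuming the replacement address.xml lives in a separate templates directory (the path is a placeholder):

        TEMPLATE_DIR="/path/to/templates"
        cp -v "$TEMPLATE_DIR/address.xml" "$DIRECTORY/$directoryname/"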

  • Source code versioning with comments (organizational practice) - leave or remove?

    - by ADTC
    Before you start admonishing me with "DON'T DO IT," "BAD PRACTICE!" and "Learn to use proper source code control", please hear me out first. I am fully aware that the practice of commenting out old code and leaving it there forever is very bad, and I hate such practice myself. But here's the situation I'm in. A few months ago I joined a company as a software developer. I had worked in the company for a few months as an intern, about a year before joining recently. Our company uses source code version control (CVS) but not properly. Here's what happened both in my internship and my current permanent position. Each time I was assigned to work on a project (legacy, about 8-10 years old). Instead of creating a CVS account and letting me check out code and check in changes, a senior colleague exported the code from CVS, zipped it up and passed it to me. While this colleague checks in all changes in bulk every few weeks, our usual practice is to do fine-grained versioning in the actual source code itself (each file increments in versions independently from the rest). Whenever a change is made to a file, old code is commented out, new code entered below it, and this whole section is marked with a version number. Finally a note about the changes is placed at the top of the file in a section called Modification History, and the changed files are placed in a shared folder, ready and waiting for the bulk check-in.

        /*
         * Copyright notice blah blah
         * Some details about file (project name, file name etc)
         * Modification History:
         * Date        Version  Modified By  Description
         * 2012-10-15  1.0      Joey         Initial creation
         * 2012-10-22  1.1      Chandler     Replaced old code with new code
         */

        code ....

        //v1.1 start
        //old code
        new code
        //v1.1 end

        code ....

    Now the problem is this. In the project I'm working on, I needed to copy some new source code files from another project (new in the sense that they didn't exist in the destination project before). These files have a lot of historical commented-out code and comment-based versioning, including a usually long or very long Modification History section. Since the files are new to this project I decided to clean them up and remove unnecessary code, including historical code, and start fresh at version 1.0. (I still have to continue the practice of comment-based versioning despite hating it. And don't ask why not start at version 0.1...) I did something similar during my internship and no one said anything. My supervisor has seen the work a few times and didn't say I shouldn't do such clean-up (if it was noticed at all). But a same-level colleague saw this and said it's not recommended, as it may cause downtime in the future and increase maintenance costs. An example is when changes are made in the other project on the original files and those changes need to be propagated to this project: with the code files drastically different, it could cause confusion for an employee doing the propagation. It makes sense to me, and is a valid point. I couldn't find any reason to do my clean-up other than the inconvenience of ridiculously messy code. So, long story short: given the practice in our company, should I not do such clean-up when copying new files from project to project? Is it better to make changes on the (copy of the) original code with full history in comments? Or what justification can I give for doing the clean-up? PS to mods: I hope you allow this question some time even if for any reason you determine it to be unfit for SO. I apologize in advance if anything is inappropriate, including tags.

  • Committing file deletions to svn repository whilst ignoring some other local mods

    - by TheJuice
    I have an svn repository where I have scheduled some files and folders to be moved in the repository with svn mv. I also have some files that are peers of the files to be moved and that have local modifications, of which I only want a subset committed along with the moves. The output of svn st looks like:

        D       foo/bar
        D       foo/bar/a.txt
        D       foo/bar/b.txt
        M       foo/exclude.txt
        M       foo/include.txt
        A       foo/whiz/bar
        A  +    foo/whiz/bar/c.txt
        A  +    foo/whiz/bar/d.txt

    To commit the moves to the repository, I would need to perform the commit on foo, but that would also commit the modifications to foo/exclude.txt and foo/include.txt. How would I commit only the deletions/additions resulting from the move, plus the mods to foo/include.txt, whilst excluding foo/exclude.txt? I have a feeling the answer lies with the --depth argument to svn ci, but it's not clear to me how it operates.
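
    A sketch of one way this is often handled: pass the scheduled paths to svn commit as explicit targets, so the unrelated modification never enters the commit (this assumes foo/whiz itself already exists in the repository):

        svn commit -m "Move bar under whiz, plus include.txt changes" \
            foo/bar foo/whiz/bar foo/include.txt

    Committing a named directory includes its scheduled children, so the deletes under foo/bar and the adds-with-history under foo/whiz/bar ride along, while foo/exclude.txt stays a local modification.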

  • Ant - using vssadd to add multiple file

    - by mamendex
    Hi, I'm trying to use the vssadd task to add a tree of source files to a recently created project on VSS, but it happens to be adding only the folder tree; all the files are missing.

        <vsscp vsspath="$/DEV/APL_${version}" ssdir="${vssapl}" serverPath="${vsssvr}"/>

    The vssadd task displays the names of the folders it's creating:

        ...
        (vssadd) $/DEV/APL_0.0.10c/src/domain:
        (vssadd) $/DEV/APL_0.0.10c/src/mbeans:
        (vssadd) $/DEV/APL_0.0.10c/src/service:
        ...

    The script runs successfully but the files never get into the repository. Trying to use wildcards is no good; the task says it found no matching files and ss returns with a code of 100:

        <vssadd ssdir="${vssapl}" localPath="C:\Workspace\APL_Build*.*" recursive="true"
                serverPath="${vsssvr}" comment="Build ${versao} at ${to.timestamp}"/>

    I've noticed that vssadd does not accept a fileset tag either, so I'm kind of lost here. Any tips? Thanks

  • Loading .dll/.exe from file into temporary AppDomain throws Exception

    Hi Gang, I am trying to make a Visual Studio add-in that removes unused references from the projects in the current solution (I know this can be done; ReSharper does it, but my client doesn't want to pay for 300 licences). Anyhoo, I use DTE to loop through the projects, compile their assemblies, then reflect over those assemblies to get their referenced assemblies and cross-examine the .csproj file. Problem: since the .dll/.exe I loaded up with Reflection doesn't unload until the app domain unloads, it is now locked and the projects can't be built again, because VS tries to re-create the files (all standard stuff). I have tried creating temporary files and then reflecting over them... no worky, the original files are still locked (I totally don't understand that, BTW). So now I am going down the path of creating a temporary AppDomain to load the files into and then destroy. I am having problems loading the files, though. The way I understand AppDomain.Load is that I should create and send a byte array of the assembly to it. I do that:

        FileStream fs = new FileStream(assemblyFile, FileMode.Open);
        byte[] assemblyFileBuffer = new byte[(int)fs.Length];
        fs.Read(assemblyFileBuffer, 0, assemblyFileBuffer.Length);
        fs.Close();

        AppDomainSetup domainSetup = new AppDomainSetup();
        domainSetup.ApplicationBase = assemblyFileInfo.Directory.FullName;
        AppDomain tempAppDomain = AppDomain.CreateDomain("TempAppDomain", null, domainSetup);
        Assembly projectAssembly = tempAppDomain.Load(assemblyFileBuffer);

    The last line throws an exception:

        Could not load file or assembly 'WindowsFormsApplication1, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified. : WindowsFormsApplication3, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

    Any help or thoughts would be greatly appreciated. My head is lopsided from beating it against the wall... Thanks, Dan
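
    For what it's worth, AppDomain.Load called from the add-in's own domain also attempts to load the assembly into the calling domain, which is why the byte-array route still ends in a load failure for the dependency. A hedged sketch of the usual workaround, a MarshalByRefObject proxy that does the reflection inside the temporary domain so nothing gets loaded or locked in the add-in's domain (type and member names are placeholders, not part of the original add-in):

        using System;
        using System.Linq;
        using System.Reflection;

        public class ReferenceLister : MarshalByRefObject
        {
            // runs inside the temporary AppDomain; only the string[] crosses back
            public string[] GetReferencedAssemblies(string assemblyPath)
            {
                Assembly assembly = Assembly.ReflectionOnlyLoadFrom(assemblyPath);
                return assembly.GetReferencedAssemblies()
                               .Select(a => a.FullName)
                               .ToArray();
            }
        }

        // usage from the add-in:
        AppDomain tempAppDomain = AppDomain.CreateDomain("TempAppDomain", null, domainSetup);
        try
        {
            var lister = (ReferenceLister)tempAppDomain.CreateInstanceAndUnwrap(
                typeof(ReferenceLister).Assembly.FullName,
                typeof(ReferenceLister).FullName);
            string[] references = lister.GetReferencedAssemblies(assemblyFile);
        }
        finally
        {
            AppDomain.Unload(tempAppDomain);   // releases the file locks held by the temp domain
        }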

  • Want to create an open file with filter in eclipse3.4

    - by Ravisha
    I am using Eclipse 3.4. I often find myself searching for files in my project, and I wanted to create a file search with a filter: I should be able to configure the filter in Preferences, and later, when I press Ctrl+Shift+F (assuming that is the shortcut I assign), it should display only those files. The reason I came up with this is that I might want to avoid Java files while searching, so life becomes a little easier.

  • Uninstalling demo/trial of Visual Studio 2008 Team System

    - by Ian Ringrose
    I wish to uninstall the trial copy of VS 2008 Team System, as the trial is coming to its end. I had VS 2008 Professional Edition installed on the machine to start with, and it still shows up in Add/Remove Programs. I am hoping that when I uninstall VS 2008 Team System I will be left with a working VS 2008 Professional Edition. When I try to uninstall VS 2008 Team System, I very quickly get an error dialog that says:

        A problem has been encountered while loading the setup components. Canceling setup.

    Help! Progress, or lack thereof, so far:

        - I have run dir %temp%\*.log in a command prompt and can't see any log files that are recent.
        - I am going to read http://en.wikipedia.org/wiki/Windows_Installer#Diagnostic_logging to see if I can get any logging.
        - Aaron Stebner's weblog has a post on where VS puts its log files; he also has a post on where some other products put their log files, which gives some info about where VS setup puts its logs etc.
        - Aaron Ruckman provided me with the solution after I sent him the log files.
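
    Not specific to the VS uninstaller, but a sketch of the generic way to force verbose Windows Installer logs when a setup dies before writing anything useful (registry values as per the Windows Installer diagnostic-logging documentation; the logs then land in %TEMP% as MSI*.log):

        reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\Installer" /v Logging /t REG_SZ /d voicewarmup /f
        reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\Installer" /v Debug /t REG_DWORD /d 7 /f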

  • I'd like to know why a function executes fine when called from x but not when called from y

    - by Roland
    When called from archive(), readcont(char *filename) executes fine. Called from runoptions(), though, it fails to list the files "archived". Why is this? The program must run in a terminal; use -h as a parameter to view the usage. The program is written to "archive" text files into ".rldzip" files, and readcont(char *x) should show the files archived in the file *x.

    a) Successful call. Use the program to archive 3 text files:

        rldzip.exe a.txt b.txt c.txt FILEXY -a

    archive() will call readcont() and it works, showing the archived files after the binary FILEXY is created.

    b) Unsuccessful call. After the file is created, use:

        rldzip.exe FILEXY.rldzip -v

    You can see that the function crashes! I'd like to know why this is happening.

        /* Write a program that:
           a) archives files
           b) extracts the archived files */
        #include <stdio.h>
        #include <stdlib.h>
        #include <conio.h>
        #include <string.h>

        struct content {
            char *text;
            char *flname;
        } *arc;

        FILE *f;

        void readcont(char *x) {
            FILE *p;
            if ((p = fopen(x, "rb")) == NULL) {
                perror("Critical error: ");
                exit(EXIT_FAILURE);
            }
            content aux;
            int i;
            fread(&i, sizeof(int), 1, p);
            printf("\nFiles in %s \n\n", x);
            while (i-- > 1 && fread(&aux, sizeof(struct content), 1, p) != 0)
                printf("%s \n", aux.flname);
            fclose(p);
            printf("\n\n");
        }

        void archive(int argc, char **argv) {
            int i;
            char inttext[5000], textline[1000];
            // Allocate dynamic memory for the content to be archived!
            arc = (content*)malloc(argc * sizeof(content));
            for (i = 1; i < argc; i++) {
                if ((f = fopen(argv[i], "r")) == NULL) {
                    printf("%s: ", argv[i]);
                    perror("");
                    exit(EXIT_FAILURE);
                }
                while (!feof(f)) {
                    fgets(textline, 5000, f);
                    strcat(inttext, textline);
                }
                arc[i-1].text = (char*)malloc(strlen(inttext) + 1);
                strcpy(arc[i-1].text, inttext);
                arc[i-1].flname = (char*)malloc(strlen(argv[i]) + 1);
                strcpy(arc[i-1].flname, argv[i]);
                fclose(f);
            }
            char *filen;
            filen = (char*)malloc(strlen(argv[argc]) + 1 + 7);
            strcpy(filen, argv[argc]);
            strcat(filen, ".rldzip");
            f = fopen(filen, "wb");
            fwrite(&argc, sizeof(int), 1, f);
            fwrite(arc, sizeof(content), argc, f);
            fclose(f);
            printf("Success! ");
            for (i = 1; i < argc; i++) {
                (i == argc - 1) ? printf("and %s ", argv[i]) : printf("%s ", argv[i]);
            }
            printf("compressed into %s", filen);
            readcont(filen);
            free(filen);
        }

        void help(char *v) {
            printf("\n\n----------------------RLDZIP----------------------\n\nUsage: \n\n Archive n files: \n\n%s $file[1] $file[2] ... $file[n] $output -a\n\nExample:\n%s a.txt b.txt c.txt output -a\n\n\n\nView files:\n\n %s $file.rldzip -v\n\nExample:\n %s fileE.rldzip -v\n\n", v, v, v, v);
        }

        void runoptions(int c, char **v) {
            int i;
            if (c < 2) {
                printf("Arguments missing! Use -h for help");
            }
            else {
                for (i = 0; i < c; i++)
                    if (strcmp(v[i], "-h") == 0) {
                        help(v[0]);
                        exit(2);
                    }
                for (i = 0; i < c; i++)
                    if (strcmp(v[i], "-v") == 0) {
                        if (c != 3) {
                            printf("Arguments misused! Use -h for help");
                            exit(2);
                        }
                        else {
                            printf("-%s-", v[1]);
                            readcont(v[1]);
                        }
                    }
            }
            if (strcmp(v[c-1], "-a") == 0)
                archive(c-2, v);
        }

        main(int argc, char **argv)
        {
            runoptions(argc, argv);
        }

  • Using a CMYK PSD without Photoshop

    - by 64BitBob
    I have run into a common, yet difficult problem. I do not use Photoshop for image manipulation. Since all my work is web-based, GIMP does what I need in 99% of the situations. The problem is that I occasionally receive PSD files with CMYK encoding rather than RGB encoding. These files will not open in GIMP, nor will they convert in ImageMagick. Has anyone found a good solution for converting CMYK files to RGB files (either PSD format or a flat format like PNG) that does not involve the use of Photoshop? Say a plug-in for GIMP or a standalone utility?
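
    A sketch of an ImageMagick invocation that sometimes manages the conversion when the PSD carries (or is assigned) ICC profiles; the profile files are placeholders and must exist locally, and [0] asks for the flattened composite rather than individual layers:

        convert 'artwork.psd[0]' -profile USWebCoatedSWOP.icc -profile sRGB.icc artwork.png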

  • Protect flash video from download/right protect

    - by Smyles
    Is it possible to protect FLV files from download? I'd like to protect my files from download, but I don't have the money for a streaming server, which I think provides some sort of protection. The files are streamed via PHP and are located in an upload folder on my server. I've used PHP to ensure that only subscribers can view the video, but I basically want to go a step further and prevent subscribers who are logged in from downloading my videos with downloaders such as the Sothink FLV Downloader for Firefox.
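
    A sketch of one common mitigation (it raises the bar rather than providing real protection): have the player request short-lived signed URLs, so a captured link stops working almost immediately. The secret, file path and expiry window below are placeholders:

        <?php
        // issue a signed, expiring link for the authenticated subscriber
        $secret  = 'replace-with-a-long-random-secret';
        $video   = 'upload/lesson01.flv';
        $expires = time() + 300; // valid for five minutes
        $token   = hash_hmac('sha256', $video . '|' . $expires, $secret);
        $url     = "/stream.php?f=" . urlencode($video) . "&e=$expires&t=$token";

        // stream.php recomputes the HMAC, rejects expired or tampered requests,
        // and only then readfile()s the flv to the client.
        ?>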

  • Multiple SessionFactories in Windows Service with NHibernate

    - by Rob Taylor
    Hi all, I have a Webapp which connects to 2 DBs (one core, the other is a logging DB). I must now create a Windows service which will use the same business logic/data access DLLs. However, when I try to reference 2 session factories in the service app and call the factory.GetCurrentSession() method, I get the error message "No session bound to current context". Does anyone have a suggestion about how this can be done?

        public class StaticSessionManager
        {
            public static readonly ISessionFactory SessionFactory;
            public static readonly ISessionFactory LoggingSessionFactory;

            static StaticSessionManager()
            {
                string fileName = System.Configuration.ConfigurationSettings.AppSettings["DefaultNHihbernateConfigFile"];
                string executingPath = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase);
                fileName = executingPath + "\\" + fileName;
                SessionFactory = cfg.Configure(fileName).BuildSessionFactory();

                cfg = new Configuration();
                fileName = System.Configuration.ConfigurationSettings.AppSettings["LoggingNHihbernateConfigFile"];
                fileName = executingPath + "\\" + fileName;
                LoggingSessionFactory = cfg.Configure(fileName).BuildSessionFactory();
            }
        }

    The configuration file has the setting:

        <property name="current_session_context_class">call</property>

    The service sets up the factories:

        private ISession _session = null;
        private ISession _loggingSession = null;
        private ISessionFactory _sessionFactory = StaticSessionManager.SessionFactory;
        private ISessionFactory _loggingSessionFactory = StaticSessionManager.LoggingSessionFactory;
        ...
        _sessionFactory = StaticSessionManager.SessionFactory;
        _loggingSessionFactory = StaticSessionManager.LoggingSessionFactory;
        _session = _sessionFactory.OpenSession();
        NHibernate.Context.CurrentSessionContext.Bind(_session);
        _loggingSession = _loggingSessionFactory.OpenSession();
        NHibernate.Context.CurrentSessionContext.Bind(_loggingSession);

    So finally, I try to call the correct factory with:

        ISession session = StaticSessionManager.SessionFactory.GetCurrentSession();

    Can anyone suggest a better way to handle this? Thanks in advance! Rob
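
    A sketch of the pattern often used inside a service, with current_session_context_class set to "thread_static" in both config files (the "call" context stores the session in the remoting CallContext, which does not behave the same on a plain service thread as it does inside a web request); CurrentSessionContext tracks the binding per factory, so both sessions can be bound at once:

        using (ISession session = StaticSessionManager.SessionFactory.OpenSession())
        using (ISession logSession = StaticSessionManager.LoggingSessionFactory.OpenSession())
        {
            NHibernate.Context.CurrentSessionContext.Bind(session);
            NHibernate.Context.CurrentSessionContext.Bind(logSession);
            try
            {
                // ... work that calls GetCurrentSession() on either factory ...
            }
            finally
            {
                NHibernate.Context.CurrentSessionContext.Unbind(StaticSessionManager.SessionFactory);
                NHibernate.Context.CurrentSessionContext.Unbind(StaticSessionManager.LoggingSessionFactory);
            }
        }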

  • Array Sorting Question for News System

    - by lemonpole
    Hello all. I'm currently stuck trying to figure out how to sort my array of files. I have a simple news-posting system that stores the content in separate .dat files and then loads them into an array. I numbered the files so that my array can sort them from lowest number to greatest; however, I have run into a small problem. To begin, here is some more information on my system so that you can understand it better. The function that gathers my files is:

        function getNewsList() {
            $fileList = array();
            // Open the actual directory
            if ($handle = opendir(ABSPATH . ADMIN . "data")) {
                // Read all files from the actual directory
                while ($file = readdir($handle)) {
                    if (!is_dir($file)) {
                        $fileList[] = $file;
                    }
                }
            }
            // Return the array.
            return $fileList;
        }

    On a separate file is the programming that processes the news post. I didn't post that code for simplicity's sake, but I will explain how the files are named. The files are numbered and part of the post's title is used; for the numbering I get a count of the array and add "1" as an offset. I get the title of the post, encode it to make it file-name friendly and limit the amount of text, so by the end of it all I end up with:

        // Make the variable that names the file that will contain
        // the post.
        $filename = "00{$newnumrows}_{$snipEncode}";

    Running print_r on the above function gives me:

        Array
        (
            [0] => 0010_Mira_mi_Soledad.dat
            [1] => 0011_WOah.dat
            [2] => 0012_Sinep.dat
            [3] => 0013_Living_in_Warfa.dat
            [4] => 0014_Hello.dat
            [5] => 001_AS.dat
            [6] => 002_ASASA.dat
            [7] => 003_SSASAS.dat
            ...
            [13] => 009_ASADADASADAFDAF.dat
        )

    And this is how my content is displayed. For some reason, according to the array sorting, 0010 comes before 001...? Is there a way I can get my array to sort 001 before 0010?
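
    A minimal sketch of one fix, assuming the goal is simply natural ordering of the numeric prefixes: natsort() compares the digit runs as numbers, so 001_ sorts ahead of 0010_, and array_values() reindexes the keys afterwards:

        $fileList = array('0010_Mira_mi_Soledad.dat', '001_AS.dat', '002_ASASA.dat');
        natsort($fileList);
        $fileList = array_values($fileList);
        print_r($fileList); // 001_AS.dat, 002_ASASA.dat, 0010_Mira_mi_Soledad.dat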

  • Folder Browser Dialog.

    - by Harikrishna
    I am using a FolderBrowserDialog in my application to select a folder, and I want only HTML files to show up inside that folder, nothing else. With an OpenFileDialog, if we only want to display HTML files we use its Filter property. How can I do the same with the FolderBrowserDialog, so that only the HTML files in the folder are shown? In other words, how can I filter files in the FolderBrowserDialog?
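
    FolderBrowserDialog only lists folders and has no Filter property, so a common workaround (sketched below, names are placeholders) is to let the user pick any HTML file through a filtered OpenFileDialog and take its containing directory:

        using (var dialog = new OpenFileDialog())
        {
            dialog.Filter = "HTML files (*.html;*.htm)|*.html;*.htm";
            dialog.CheckFileExists = true;
            if (dialog.ShowDialog() == DialogResult.OK)
            {
                string folder = System.IO.Path.GetDirectoryName(dialog.FileName);
                // work with the selected folder here
            }
        }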

  • Windows Azure: Creating a subdirectories inside the blob

    - by veda
    I wanted to create some subdirectories inside my blob container, but it is not working out well. Here is my code:

        protected void ButUpload_click(object sender, EventArgs e)
        {
            // store the uploaded file as a blob
            if (uplFileUpload.HasFile)
            {
                name = uplFileUpload.FileName;
                // get reference to the cloud blob container
                CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");
                if (textbox.Text != "")
                {
                    name = textbox.Text + "/" + name;
                }
                // set the name for the uploaded file
                string UploadDocName = name;
                // get the blob reference and set the metadata properties
                CloudBlockBlob blob = blobContainer.GetBlockBlobReference(UploadDocName);
                blob.Metadata["FILETYPE"] = "text";
                blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;
                // upload the blob to storage
                blob.UploadFromStream(uplFileUpload.FileContent);
            }
        }

    What I do is: if I have to create a subdirectory, I enter the name of the subdirectory in the textbox. For example, if I want to create a file named "test.txt" inside the subdirectory "files", then textbox.Text = files and uplFileUpload.FileName = test.txt. I concatenate them and upload to the blob, but it is not working well. I am getting just

        https://test.core.windows.net/documents/files/

    I was expecting

        https://test.core.windows.net/documents/files/test.txt

    What am I doing wrong? How do I create subdirectories inside the blob container?
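
    For reference, a sketch of how the result is usually checked: blob storage has no real folders, the "/" in the blob name is the whole story, so listing the container flat should show the full name if the upload succeeded (this assumes the 1.x StorageClient library the code above appears to use):

        CloudBlobContainer container = cloudBlobClient.GetContainerReference("documents");
        var options = new BlobRequestOptions { UseFlatBlobListing = true };
        foreach (IListBlobItem item in container.ListBlobs(options))
        {
            Console.WriteLine(item.Uri);   // e.g. .../documents/files/test.txt
        }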

  • Fastest way to deploy rails apps with Passenger

    - by yuval
    I am working on a DreamHost server with Rails 2.3.5. Every time I make changes to a site, I have to ssh into the server, remove all the files, upload a zip file containing all the new files for the site, unzip that file, migrate the database, and go. Something tells me there's a faster way to deploy Rails apps. I am using Mac Time Machine to keep track of different versions of my applications. I know git tracks files, but I don't really know how to work with it to deploy my applications, since Passenger takes care of all the magic for me. What would be a faster way to deploy my applications (and avoid the downtime associated with my method when I delete all the files on the server)? Thanks!
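
    A sketch of a bare-bones git-over-ssh deploy, no Capistrano involved (host, repository location and paths are placeholders; cloning the repository on the server is a one-off step):

        ssh user@example.com <<'EOF'
        cd ~/apps/mysite
        git pull origin master
        rake db:migrate RAILS_ENV=production
        touch tmp/restart.txt   # Passenger reloads the app on the next request, so no files are deleted
        EOF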
