Search Results

Search found 37684 results on 1508 pages for 'msp files'.


  • Zipping only files using PowerShell

    - by SteB
    I'm trying to zip all the files in a single directory to a different folder as part of a simple backup routine. The code runs OK but doesn't produce a zip file:

        $srcdir = "H:\Backup"
        $filename = "test.zip"
        $destpath = "K:\"
        $zip_file = (new-object -com shell.application).namespace($destpath + "\"+ $filename)
        $destination = (new-object -com shell.application).namespace($destpath)
        $files = Get-ChildItem -Path $srcdir
        foreach ($file in $files) {
            $file.FullName;
            if ($file.Attributes -cne "Directory") {
                $destination.CopyHere($file, 0x14);
            }
        }

    Any ideas where I'm going wrong?
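
    For comparison, a minimal sketch of the same backup step using Python's standard zipfile module instead of the Shell.Application COM object; the source and destination paths are the ones from the question, and the sketch assumes Python is installed on the box.

        import os
        import zipfile

        src_dir = r"H:\Backup"       # source folder from the question
        dest_zip = r"K:\test.zip"    # target archive from the question

        # Create (or overwrite) the archive, adding only files and skipping directories.
        with zipfile.ZipFile(dest_zip, "w", compression=zipfile.ZIP_DEFLATED) as zf:
            for name in os.listdir(src_dir):
                full = os.path.join(src_dir, name)
                if os.path.isfile(full):
                    zf.write(full, arcname=name)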

    Read the article

  • Is there a word processor similar to MS Word which saves files as readable txt files?

    - by zenbomb
    I'm writing a paper together with my supervisor and would like to have more sophisticated version control than *_291112_NEW_NEW_revised1.doc files. My supervisor is a non-computer person who will never ever use LaTeX or git and loves MS Word. I'm therefore looking for an alternative to Word (I need commenting on text passages!) which stores the files as clean text (markup for formatting is fine), so I'm able to put them under version control on my side. I'm aware that git can also handle binary files, but I'd prefer the cleaner way of looking at the contents directly. If there's a way to automatically extract the text from Word files, I'm fine with that too for now.
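
    On that last point, a minimal sketch of pulling the plain text out of a .docx file with the python-docx package; it assumes the documents are .docx (legacy binary .doc files would need converting first), and the file names are just placeholders.

        from docx import Document  # pip install python-docx

        def docx_to_text(path):
            """Return the paragraph text of a .docx file as one plain string."""
            doc = Document(path)
            return "\n".join(p.text for p in doc.paragraphs)

        # Dump a text copy next to the original so it can be diffed and versioned.
        with open("paper_draft.txt", "w", encoding="utf-8") as out:
            out.write(docx_to_text("paper_draft.docx"))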

    Read the article

  • Grep-ing gzipped files [duplicate]

    - by Julien Genestoux
    This question already has an answer here: Grepping through .gz log files (5 answers). I have a set of 100 log files, compressed using gzip. I need to find all lines matching a given expression. I'd use grep, but of course that's a bit of a nightmare, because I'd have to unzip each file, grep it, and delete the unzipped version, since they wouldn't all fit on my server if they were all unzipped. Does anyone have a trick for getting this done quickly?
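
    zgrep (or zcat piped into grep) searches the compressed files in place without unpacking them to disk. As an alternative, a small Python sketch that streams each archive the same way; the glob pattern and the expression are placeholders.

        import glob
        import gzip
        import re

        pattern = re.compile(r"ERROR 42")                    # placeholder expression
        for path in sorted(glob.glob("/var/log/app/*.gz")):  # placeholder location
            with gzip.open(path, "rt", errors="replace") as fh:
                for lineno, line in enumerate(fh, 1):
                    if pattern.search(line):
                        print(f"{path}:{lineno}:{line.rstrip()}")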

    Read the article

  • Shares not working on boot; need to reinstall "File and Printer Sharing for Microsoft Networks" on the server every morning to fix it

    - by Neaox
    I had a problem a few days ago; see this question: Can no longer access computer or network shares on my server from any other computers on the network. The fix that I found does in fact work, however when I boot my PC in the morning the shares are no longer working. To fix this I need to remote desktop into the server and re-install "File and Printer Sharing for Microsoft Networks" on the main adaptor. Doing this makes the shares work again, but it is annoying to have to do this each and every morning. On top of this, my offline files are no longer available offline: I store my user profile on the server and had those files set to "Always Available", however since this started they are no longer available offline and the option to make them available offline is missing from the context menu. Another problem, and I don't know if this is the cause or just a symptom of a deeper issue, is that this server runs Hyper-V, and since these problems started I can no longer remote desktop into the Hyper-V client. Thanks for any help anyone can give.

    Read the article

  • Adding items to a file's right-click menu in Windows Explorer

    - by Fire Lancer
    What do I need to do to add an item to the right-click menu for files with certain file extensions, along with submenus? An example would be adding items to run Python files (.py, .pyw, .pyc) with a specific version of Python, so the menu for a .py file would look something like:

        Open
        7-Zip  > ...7-Zip items
        Run    > Python 2.5
                 Python 2.6
                 Python 3.1
        Edit   > IDLE 2.5
                 IDLE 2.6
                 IDLE 3.1
        ...various other items
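
    Static context-menu entries for a file type live under the type's ProgID in HKEY_CLASSES_ROOT, at shell\<verb>\command. A minimal sketch that adds a single "Run with Python 2.6" verb; it assumes .py maps to the Python.File ProgID and that python.exe sits at the path shown, writing to HKCR needs administrator rights, and cascading submenus like "Run >" need the separate SubCommands mechanism on top of this.

        import winreg

        # Assumed values; check what .py actually maps to first (e.g. reg query HKCR\.py).
        progid = "Python.File"
        verb = "Run with Python 2.6"
        command = r'"C:\Python26\python.exe" "%1"'   # hypothetical install path

        # Creates HKCR\Python.File\shell\Run with Python 2.6\command and sets its
        # default value to the command line Explorer should run for "%1".
        key_path = rf"{progid}\shell\{verb}\command"
        winreg.SetValue(winreg.HKEY_CLASSES_ROOT, key_path, winreg.REG_SZ, command)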

    Read the article

  • Updating shared files across computers

    - by murgatroid99
    I have a file server running Windows Server 2008 and a couple of laptops running Windows 7 on a network. There are a large number of files that all users will need access to. My plan is to have the files on both the server and the laptops because the users will need to access the files in places with no Internet access. I also want any changes made to the files on any of the laptops to propagate to the server and then propagate to the other laptops whenever they connect to the network. Should I do this with a scheduled batch script with a few xcopy commands or is there a better way to do it?
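
    Robocopy (for example with /XO to skip older files, or /MIR to mirror) and DFS Replication on Server 2008 are the usual built-in options for this. Purely as an illustration, a rough "newest copy wins" sketch in Python; the paths are hypothetical and it deliberately ignores deletions and simultaneous edits.

        import os
        import shutil

        def copy_newer(src_root, dst_root):
            """Copy every file from src_root to dst_root whose source copy is newer."""
            for dirpath, _dirs, files in os.walk(src_root):
                rel = os.path.relpath(dirpath, src_root)
                target_dir = os.path.join(dst_root, rel)
                os.makedirs(target_dir, exist_ok=True)
                for name in files:
                    src = os.path.join(dirpath, name)
                    dst = os.path.join(target_dir, name)
                    if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                        shutil.copy2(src, dst)

        # Run both directions when a laptop reconnects to the network.
        copy_newer(r"C:\SharedFiles", r"\\fileserver\SharedFiles")   # hypothetical paths
        copy_newer(r"\\fileserver\SharedFiles", r"C:\SharedFiles")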

    Read the article

  • Extract structured data from many MS Word files

    - by Mark
    I have ~160 MS Word files that contain structured data. The data is formatted identically across all files and resides in a tabular format. I'd like to extract the data into a database, XML or just an aggregate table without opening each of the files independently. Is there a tool or method I can use to extract this data?
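
    If the files are .docx (or can be batch-converted to .docx), a minimal sketch using the python-docx package that walks a folder and appends every table row to a single CSV; the folder and output names are placeholders.

        import csv
        import glob

        from docx import Document  # pip install python-docx

        rows = []
        for path in sorted(glob.glob(r"C:\word_reports\*.docx")):  # placeholder folder
            doc = Document(path)
            for table in doc.tables:
                for row in table.rows:
                    # Prefix each row with the source file so the aggregate stays traceable.
                    rows.append([path] + [cell.text.strip() for cell in row.cells])

        with open("aggregate.csv", "w", newline="", encoding="utf-8") as out:
            csv.writer(out).writerows(rows)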

    Read the article

  • Flash Player is creating thousands of .tmp files

    - by Ed Manet
    We have seen a number of machines in our environment (XP Pro SP3) that have been running out of disk space because of .TMP files in the windows\temp folder. One machine had 6 GB of .TMP files on it starting from around August 2010. The files are all 305 KB in size and they seem to get created every 10 minutes. The files appear to be either .EXEs or .DLLs when opened in a hex editor. The words "this program can not be run in DOS mode" are at the beginning of the file and the words "Adobe Flash Player" are scattered all over the end of the file (probably the string table). While it's easy enough to clean them up, I'd like to find the root cause of the issue. Has anybody else seen this?

    Read the article

  • Windows log file monitor that supports custom events (e.g. sending an email when it detects the string "ERROR")

    - by ilitirit
    I know this question has been asked several times before but I can't seem to find a solution for my requirements. I currently use BareTail, which works wonderfully except that it doesn't support custom events besides line highlighting. I'm also trying TailForWin32. It has an SMTP plugin, but it seems to be in beta, and the highlighting seems limited. It also doesn't handle rolling log files very well (a blocking dialog box pops up, whereas BareTail just rolls over naturally). All I really need is something like BareTail that supports custom events. First prize would be a tool with a plugin-based architecture so I can use my own messaging plugins, but anything that supports SMTP mail would be fine as well.
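
    A minimal sketch of the "email on ERROR" piece in Python: it follows the end of a log file and mails each matching line. The log path, SMTP host, and addresses are placeholders, and rolling log files are not handled.

        import smtplib
        import time
        from email.message import EmailMessage

        LOG_PATH = r"C:\logs\app.log"      # placeholder
        SMTP_HOST = "smtp.example.local"   # placeholder

        def send_alert(line):
            msg = EmailMessage()
            msg["Subject"] = "ERROR detected in log"
            msg["From"] = "monitor@example.local"
            msg["To"] = "ops@example.local"
            msg.set_content(line)
            with smtplib.SMTP(SMTP_HOST) as smtp:
                smtp.send_message(msg)

        with open(LOG_PATH, "r", errors="replace") as f:
            f.seek(0, 2)                   # start tailing from the current end of file
            while True:
                line = f.readline()
                if not line:
                    time.sleep(1)          # wait for new lines to be appended
                    continue
                if "ERROR" in line:
                    send_alert(line)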

    Read the article

  • Linux: Create files and directories but not delete them

    - by Peraz
    I have a process that creates directories and files inside a working directory, e.g.:

        /workingdir/file1
        /workingdir/file2
        /workingdir/dir1/file1
        /workingdir/dir1/dir2/file1
        /workingdir/dir1/file2

    I need to prevent deletion/overwriting of the created folders/files for that user, but still allow new folders/subfolders/files to be created. I've tried permissions, GIDs, and ACLs with no luck. What is the correct way to do that? (I can use a cron job if needed.)
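
    One possible cron-based approach, since a cron job is acceptable: periodically mark everything that already exists as immutable with chattr +i, which blocks deletion and overwriting (even by the owner) while new files can still be created. A rough sketch, assuming it runs as root on an ext2/3/4 filesystem; files stay writable only until the next pass, and directories are left alone so new entries can still appear in them.

        import os
        import subprocess

        WORKING_DIR = "/workingdir"

        # Set the immutable attribute on every existing file (not on directories,
        # otherwise nothing new could be created inside them).
        for dirpath, _dirs, files in os.walk(WORKING_DIR):
            for name in files:
                subprocess.run(["chattr", "+i", os.path.join(dirpath, name)], check=False)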

    Read the article

  • Editing files on linux server from windows

    - by celicni
    I want to edit text files on a Linux server from Windows, using a text editor like Notepad++ or UltraEdit. I've managed to do so using WinSCP: it can edit files remotely and lets me choose a local application to open them with. That is exactly what I need, but when I hit Ctrl+S (not every time, in about 50% of cases), it waits for around 10 seconds, alerts that the connection has failed, and offers to "abort". When I click abort, it instantly reconnects and saves the file. Does anybody know another way to edit files remotely without this annoying waiting period?

    Read the article

  • Moving files within an ext4 filesystem?

    - by HT74
    I'd like to decrease the access-time for some files by moving them to the beginning of the fs. Task 1: Clear a certain block range at the beginning of the fs (moving existing files to free space elsewhere). Task 2: Move the files in question to that block range (should be able to grow a bit). How would I do that?

    Read the article

  • How do I open 2 instances of the same file in Notepad++ side by side, with their own scrollbars, in a single Notepad++ window?

    - by Qlidnaque
    I remember doing this a long time ago and have forgotten how I did it. I like to do this when I have long HTML or PHP files to edit and I need part of the code from further down the file in a place nearer the top, or when I want to compare different parts of the same file. There was a way to do this without opening two instances of Notepad++, and when I clicked save, the change was saved in both views of the open file (whereas if I have two separate Notepad++ windows open, it prompts me to reload or keep the second copy whenever the first one is saved).

    Read the article

  • Missed something? Can't upload files to server (permissions)

    - by Camran
    I can upload files as "root" to the Ubuntu server. Then I created a user (me). Next I added the user to the group www-data. Then I assigned rwx permissions to www-data. Now, when I try to upload, delete or modify files via FileZilla, I can't. But via the terminal, I can change files using the sudo command. What should I do to be able to upload files without getting "permission denied" in FileZilla? If you need more input let me know. Thanks

    Read the article

  • Binary diff/patch for large files on linux?

    - by thejh
    I've got two partition images (A and B) and want to use them to create a patch that I can apply to A on another computer in order to get the new B image without flooding the network. I have the following requirements:

        - works on Linux
        - can create diffs
        - can use diffs to patch files
        - can handle binary files
        - can handle large files (a few hundred GB should work)
        - no user interaction required (just a console application)
        - ideally, should be able to read from/write to pipes (so that I can pipe into it from a gzip-compressed file and write to one)

    Does something like that exist?
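
    Tools like xdelta3, rdiff, and bsdiff are the usual candidates here (bsdiff tends to be too memory-hungry for images this size). A rough sketch driving xdelta3 from Python, using its documented -e (encode), -d (decode) and -s (source) options; the image paths are placeholders.

        import subprocess

        OLD = "/images/A.img"              # placeholder paths
        NEW = "/images/B.img"
        DELTA = "/images/A_to_B.xdelta"

        # On the machine that has both images: create the delta.
        subprocess.run(["xdelta3", "-e", "-s", OLD, NEW, DELTA], check=True)

        # On the other machine, which only has A plus the delta: rebuild B.
        subprocess.run(["xdelta3", "-d", "-s", OLD, DELTA, NEW], check=True)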

    Read the article

  • Automate monitoring a string in different log files

    - by EVIA
    I have a few log files on different servers and I want to check the output at the end of those log files, e.g.:

        success: 4000
        failed: 200

    These log files are generated daily and I have to keep track of these numbers. Is there any way I can automate this instead of going and checking each file by hand and wasting so much of my time? I want to create some kind of script that goes to \\serverA\C$\log_07_02_2012.txt and checks one line, goes to \\serverB\C$\log_07_02_2012.txt and checks some other line, and so on, and gives me the output from all of them.
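
    A minimal sketch of that script in Python, run from a machine that can read the administrative C$ shares; the server names, the date format in the file name, and the strings being checked are all assumptions.

        import datetime

        SERVERS = ["serverA", "serverB"]        # placeholder server names
        KEYWORDS = ("success:", "failed:")      # the summary lines we care about

        today = datetime.date.today().strftime("%d_%m_%Y")  # assumed dd_mm_yyyy naming
        for server in SERVERS:
            path = rf"\\{server}\C$\log_{today}.txt"
            try:
                with open(path, "r", errors="replace") as f:
                    tail = f.readlines()[-20:]  # only the end of the file matters
            except OSError as exc:
                print(f"{server}: could not read {path}: {exc}")
                continue
            for line in tail:
                if line.strip().lower().startswith(KEYWORDS):
                    print(f"{server}: {line.strip()}")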

    Read the article

  • Parsing flat files using SSIS : SSIS Nugget

    - by jamiet
    Often when using SQL Server Integration Services (SSIS) you will find there is more than one way of accomplishing a task and that the most obvious method of doing so might not be the optimal one. In the video below I demonstrate this by way of an experiment using SSIS’s Flat File Source component; I show different ways that you can pull data from a flat file into the SSIS dataflow and also how the nature of the data itself can influence your choice as to how this task should be accomplished. If you are having trouble viewing the video in your blog reader then head to http://sqlblog.com/blogs/jamie_thomson/archive/2010/03/25/parsing-flat-files-using-ssis-ssis-nugget.aspx to see it as it is hosted on my blog! The main point I want to get across from this video is that a little bit of creative thinking when building your dataflows can sometimes be very beneficial for performance; quite often building a solution that isn’t the most obvious might actually turn out to be the best one. You’ll notice, if you have watched the video, that my editing skills weren’t quite up to snuff and I cut off the final few words; however, all I was saying was that if you have any feedback on this video then I would love to hear it, either via email or preferably the comments section below. I hope this turns out to be useful to some of you. @Jamiet P.S. Incidentally, the parsing that we do using SSIS expressions in the video would be much easier if we had a TOKENISE function in SSIS’s expression language, and I have asked for the introduction of such a function on Connect at [SSIS] TOKEN(string, tokeniser_string, occurence) function. Feel free to go and vote that up if you think this feature would be useful!

    Read the article

  • SQL SERVER – Reduce the Virtual Log Files (VLFs) from LDF file

    - by pinaldave
    Earlier, I wrote a quick note on SQL SERVER – Detect Virtual Log Files (VLF) in LDF. Because of this I got responses suggesting that too many VLFs are bad for the log file. This prompts a simple question: “How many is ‘too many’ VLFs?” I suggest that you go and read an article written by Kimberly over here. I am sure that you are going to have a clear understanding of what a good number for your VLFs is from that article. If you have lots of VLFs, you can reduce them right away using the following method (I am just attempting to write a working script over here):

        USE AdventureWorks
        GO
        BACKUP LOG AdventureWorks TO DISK='d:\adtlog.bak'
        GO
        -- Get the logical file name of the log file
        sp_helpfile
        GO
        DBCC SHRINKFILE(AdventureWorks_Log, TRUNCATEONLY)
        GO
        ALTER DATABASE AdventureWorks MODIFY FILE (NAME = AdventureWorks_Log, SIZE = 1GB)
        GO
        DBCC LOGINFO
        GO

    Again, here I have assumed that your initial log size is 1 GB, but in reality you should select the number based on your own ideal size of the log file. If your log file grows to 10 GB every day, you may want to put the value as 10 GB. For accuracy, read what Kimberly’s original article says over here. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Fixing up Configurations in BizTalk Solution Files

    - by Elton Stoneman
    Just a quick one this, but useful for mature BizTalk solutions, where over time the configuration settings can get confused, meaning Debug configurations building in Release mode, or Deployment configurations building in Development mode. That can cause issues in the build which aren't obvious, so it's good to fix up the configurations. It's time-consuming in VS or in a text editor, so this bit of PowerShell may come in useful - just substitute your own solution path in the $path variable:

        $path = 'C:\x\y\z\x.y.z.Integration.sln'
        $backupPath = [System.String]::Format('{0}.bak', $path)
        [System.IO.File]::Copy($path, $backupPath, $True)
        $sln = [System.IO.File]::ReadAllText($path)

        $sln = $sln.Replace('.Debug|.NET.Build.0 = Deployment|.NET', '.Debug|.NET.Build.0 = Development|.NET')
        $sln = $sln.Replace('.Debug|.NET.Deploy.0 = Deployment|.NET', '.Debug|.NET.Deploy.0 = Development|.NET')
        $sln = $sln.Replace('.Debug|Any CPU.ActiveCfg = Deployment|.NET', '.Debug|Any CPU.ActiveCfg = Development|.NET')
        $sln = $sln.Replace('.Deployment|.NET.ActiveCfg = Debug|Any CPU', '.Deployment|.NET.ActiveCfg = Release|Any CPU')
        $sln = $sln.Replace('.Deployment|Any CPU.ActiveCfg = Debug|Any CPU', '.Deployment|Any CPU.ActiveCfg = Release|Any CPU')
        $sln = $sln.Replace('.Deployment|Any CPU.Build.0 = Debug|Any CPU', '.Deployment|Any CPU.Build.0 = Release|Any CPU')
        $sln = $sln.Replace('.Deployment|Mixed Platforms.ActiveCfg = Debug|Any CPU', '.Deployment|Mixed Platforms.ActiveCfg = Release|Any CPU')
        $sln = $sln.Replace('.Deployment|Mixed Platforms.Build.0 = Debug|Any CPU', '.Deployment|Mixed Platforms.Build.0 = Release|Any CPU')
        $sln = $sln.Replace('.Deployment|.NET.ActiveCfg = Debug|Any CPU', '.Deployment|.NET.ActiveCfg = Release|Any CPU')
        $sln = $sln.Replace('.Debug|.NET.ActiveCfg = Deployment|.NET', '.Debug|.NET.ActiveCfg = Development|.NET')

        [System.IO.File]::WriteAllText($path, $sln)

    The script creates a backup of the solution file first, and then fixes up all the configs to use the correct builds. It's a simple search and replace list, so if there are any patterns that need to be added let me know and I'll update the script. A RegEx replace would be neater, but when it comes to hacking solution files, I prefer the conservative approach of knowing exactly what you're changing.

    Read the article

  • Central Ohio Day of .Net 2010 Slides and Files

    - by Brian Jackett
    This weekend I presented my “The Power of PowerShell + SharePoint 2007” session at the Central Ohio Day of .Net conference in Wilmington, OH. This is the second year I’ve attended this conference, and the first time as a presenter. For those unfamiliar, Day of .Net conferences are one-day conferences on all things .NET, organized by developers for developers. These events are usually offered at no cost to anyone interested in .NET development. The attendees of my session had some great questions and I hope they all got something worthwhile out of it. Below are my slides and demo scripts (some of which I didn’t have time to demo) along with my sample profiles. If you have any questions, comments, or feedback feel free to leave comments here or send me an email at [email protected]. Slides and Files: SkyDrive link. Technology and Friends Interview Experience: On a side note, any of you familiar with one of my Sogeti co-workers in Detroit, David Giard, may know that he hosts a web series called Technology and Friends. After my session David tracked me down and asked to interview me about PowerShell. I was happy to oblige, so we sat down and taped some material. I don’t know when that interview will be going live, but look for it on www.davidgiard.com. Conclusion: A big thanks goes out to all of the sponsors, speakers, and attendees of the Central Ohio Day of .Net conference. Without all of them this conference couldn’t have been possible. I had a great time at the conference and look forward to coming back next year, whether that is as a speaker or an attendee. -Frog Out

    Read the article

  • Deploying war files in tomcat6

    - by user3215
    I have been using Tomcat 5.5 for a long time on Ubuntu servers 8.10 and 9.10, and '/usr/share/tomcat/webapps/' is the path where I place my .war files and access them in the browser over the network. On one system I've installed Tomcat 6 and I can't work out where to place my .war file in Tomcat 6's webapps. I tried deploying the war under /var/lib/tomcat6/webapps/ and the war file is extracted, so I think this should be the location, but I could not access the page when I tried http://serverip:8080/myapp. I can access the default page properly when I navigate to http://serverip:8080. The same war file works fine on Tomcat servers which were not installed from the apt repository. Log messages:

        INFO: Stopping Coyote HTTP/1.1 on http-8080
        2 Dec, 2010 10:06:29 AM org.apache.coyote.http11.Http11Protocol init
        INFO: Initializing Coyote HTTP/1.1 on http-8080
        2 Dec, 2010 10:06:29 AM org.apache.catalina.startup.Catalina load
        INFO: Initialization processed in 523 ms
        2 Dec, 2010 10:06:29 AM org.apache.catalina.core.StandardService start
        INFO: Starting service Catalina
        2 Dec, 2010 10:06:29 AM org.apache.catalina.core.StandardEngine start
        INFO: Starting Servlet Engine: Apache Tomcat/6.0.20
        2 Dec, 2010 10:06:30 AM org.apache.catalina.startup.HostConfig deployWAR
        INFO: Deploying web application archive myapp.war
        2 Dec, 2010 10:06:32 AM org.apache.catalina.core.StandardContext start
        SEVERE: Error listenerStart
        2 Dec, 2010 10:06:32 AM org.apache.catalina.core.StandardContext start
        SEVERE: Context [/myapp] startup failed due to previous errors
        2 Dec, 2010 10:06:32 AM org.apache.coyote.http11.Http11Protocol start
        INFO: Starting Coyote HTTP/1.1 on http-8080
        2 Dec, 2010 10:06:32 AM org.apache.catalina.startup.Catalina start
        INFO: Server startup in 3110 ms
        2 Dec, 2010 10:06:39 AM org.apache.catalina.core.StandardContext start
        SEVERE: Error listenerStart
        2 Dec, 2010 10:06:39 AM org.apache.catalina.core.StandardContext start
        SEVERE: Context [/myapp] startup failed due to previous error

    Any help?

    Read the article

  • Java issues on OpenVZ Ubuntu 11.04 (.jar/.sh files)

    - by IWillNotChange
    I've had a whole line of messes with Java and .jar files. I've tried both OpenJDK (from the software installer) and about three repositories for Sun.

        /Desktop# java -jar -Xmx1024m ss.jar
        Exception in thread "main" java.awt.HeadlessException
            at java.awt.GraphicsEnvironment.checkHeadless(GraphicsEnvironment.java:173)
            at java.awt.Window.<init>(Window.java:476)
            at java.awt.Frame.<init>(Frame.java:419)
            at java.awt.Frame.<init>(Frame.java:384)
            at javax.swing.JFrame.<init>(JFrame.java:174)
            at org.powerbot.bd.<init>(Unknown Source)
            at org.powerbot.Boot.main(Unknown Source)

    Two separate errors:

        ~/Desktop# ./ss.sh
        [SEVERE] org.server.Boot: Default heap size of 490m too small, restarting with 768m

    and about 30 different crashes where it just "aborts" with a huge file dump. Each time I've tried something a little different, whether it be updating Java or just changing -Xmx1024 to -Xmx1024m to get rid of the heap message. Personally I think it has something to do with OpenVZ, but Google hasn't saved me this time; I need someone who can get to the bottom of my problem. My current install is:

        java -version
        java version "1.6.0_26"
        Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
        Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)

    Running ss.sh gives me (I'd post the entire log but it's long):

        #
        # A fatal error has been detected by the Java Runtime Environment:
        #
        # SIGILL (0x4) at pc=0x00002b14278e6fa0, pid=9301, tid=47365590714112
        #
        # JRE version: 6.0_26-b03
        # Java VM: Java HotSpot(TM) 64-Bit Server VM (20.1-b02 mixed mode linux-amd64 compressed oops)
        # Problematic frame:
        # C [ld-linux-x86-64.so.2+0x14fa0] _dl_make_stack_executable+0x2b50
        #
        # If you would like to submit a bug report, please visit:
        # http://java.sun.com/webapps/bugreport/crash.jsp
        # The crash happened outside the Java Virtual Machine in native code.
        # See problematic frame for where to report the bug.
        #

    I'm willing to let someone who knows what they are talking about view it and try to sort this out. Any help would be appreciated; I've about pulled all my hair out Googling to no avail.

    Read the article

  • How to join video files from the terminal?

    - by Leon Vitanos
    I have tried avidemux2_cli, mencoder, ffmpeg and cat, but none of them always work (most of the time the error is that the audio codec is not the same). Maybe I put the wrong options in the commands. The commands:

        cat Sample.avi rrr.avi > complete.avi
        ffmpeg -i Sample.avi -i output.avi -vcodec copy -acodec copy complete.avi
        mencoder -ovc lavc -oac copy Sample.avi rrr.avi -o complete.avi
        avidemux2_cli --audio-codec copy --video-codec copy --output-format avi --load Sample.avi -append output.avi --save video.avi

    With cat there is no error, but it doesn't always work: complete.avi ends up exactly the same as Sample.avi. ffmpeg does nothing; complete.avi is again the same as Sample.avi. mencoder errors with "All files must have identical audio codec and format for -oac copy", so complete.avi is the same as Sample.avi. With avidemux2_cli there is no error, but complete.avi is once more the same as Sample.avi. So to sum up, every complete.avi is identical to Sample.avi, and the problem is that the inputs don't have the same audio codec (I guess). Any ideas?
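
    Since the clips don't share an audio codec, the stream-copy attempts above can't succeed; one option is to re-encode while joining with ffmpeg's concat filter. A rough sketch wrapping that call from Python; the file names are the ones from the question, and the output codecs are left to ffmpeg's defaults.

        import subprocess

        inputs = ["Sample.avi", "rrr.avi"]   # clips from the question
        output = "complete.avi"

        # Equivalent command line:
        #   ffmpeg -i Sample.avi -i rrr.avi \
        #     -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" \
        #     -map "[v]" -map "[a]" complete.avi
        cmd = ["ffmpeg"]
        for path in inputs:
            cmd += ["-i", path]
        pads = "".join(f"[{i}:v][{i}:a]" for i in range(len(inputs)))
        cmd += [
            "-filter_complex", f"{pads}concat=n={len(inputs)}:v=1:a=1[v][a]",
            "-map", "[v]", "-map", "[a]",
            output,
        ]
        subprocess.run(cmd, check=True)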

    Read the article

  • updatedb & locate command problem - Files from external hard drive are no longer indexed after rebooting

    - by user784637
    Files from my external hard drive are no longer indexed after rebooting. I have to remount and then run updatedb (as root) after each reboot. The problem is that updatedb takes a few minutes for my external hard drives. Is there any way I can retain the index for my externals after I reboot, so that the locate command can still search through them?

    EDIT: Per request, here are my specs:

        $ cat /etc/updatedb.conf
        PRUNE_BIND_MOUNTS="yes"
        # PRUNENAMES=".git .bzr .hg .svn"
        PRUNEPATHS="/tmp /var/spool /media"
        PRUNEFS="NFS nfs nfs4 rpc_pipefs afs binfmt_misc proc smbfs autofs iso9660 ncpfs coda devpts ftpfs devfs mfs shfs sysfs cifs lustre_lite tmpfs usbfs udf fuse.glusterfs fuse.sshfs ecryptfs fusesmb devtmpfs"

        # mount
        /dev/sda5 on / type ext4 (rw,errors=remount-ro)
        proc on /proc type proc (rw,noexec,nosuid,nodev)
        none on /sys type sysfs (rw,noexec,nosuid,nodev)
        none on /sys/fs/fuse/connections type fusectl (rw)
        none on /sys/kernel/debug type debugfs (rw)
        none on /sys/kernel/security type securityfs (rw)
        none on /dev type devtmpfs (rw,mode=0755)
        none on /dev/pts type devpts (rw,noexec,nosuid,gid=5,mode=0620)
        none on /dev/shm type tmpfs (rw,nosuid,nodev)
        none on /var/run type tmpfs (rw,nosuid,mode=0755)
        none on /var/lock type tmpfs (rw,noexec,nosuid,nodev)
        none on /lib/init/rw type tmpfs (rw,nosuid,mode=0755)
        binfmt_misc on /proc/sys/fs/binfmt_misc type binfmt_misc (rw,noexec,nosuid,nodev)
        gvfs-fuse-daemon on /home/me/.gvfs type fuse.gvfs-fuse-daemon (rw,nosuid,nodev,user=me)
        /dev/sdb1 on /media/me type fuseblk (rw,nosuid,nodev,allow_other,blksize=4096,default_permissions)
        /dev/sdd1 on /media/Little Boy type fuseblk (rw,nosuid,nodev,allow_other,blksize=4096,default_permissions)
        /dev/sde1 on /media/Fat Man type fuseblk (rw,nosuid,nodev,allow_other,blksize=4096,default_permissions)

        # on_ac_power; echo $?
        255

    Read the article

  • Versioning and Continuous Integration with project settings files

    - by Michael Stephenson
    I came across something which was a bit of a pain in the bottom the other week. Our scenario was that we had implemented a helper-style assembly which had some custom configuration implemented through the project settings. I'm sure most of you are familiar with this, where you end up with a settings file which is viewable through the C# project file and you can configure some basic settings. The settings are embedded in the assembly during compilation as part of a DefaultValue attribute. You have the ability to override the settings by adding information to your app.config, and if the app.config doesn’t override the settings then the embedded default is used. All normal C# stuff so far… Where our pain started was when we implemented Continuous Integration and we wanted to version all of this from our build. What I was finding was that the assembly was versioned fine but the embedded default value was maintaining the non-CI build version number. I ended up getting this to work by using a build task to change the version numbers in the following files:

        App.config
        Settings.settings
        Settings.designer.cs

    I think I probably could have got away with just the settings.designer.cs, but wanted to keep them all consistent in case we had to look at the code on the build server for some reason. I think the reason this was painful was because the settings.designer.cs is only updated through Visual Studio, which writes out the code to this file (including the DefaultValue attribute) when the project is saved rather than as part of the compilation process. The compile just compiles the already existing C# file. As I said, we got it working, but it was a bit of a pain. If anyone has a better solution for this I'd love to hear it.

    Read the article
