Search Results

Search found 51448 results on 2058 pages for 'log files'.

  • Save Files Directly from Your Browser to the Cloud in Chrome and Iron

    - by Asian Angel
    Are you looking for a quicker, easier way to upload files you find while browsing to your favorite cloud services? Skip saving files to your hard drive and transfer them directly from your browser to your accounts using the Cloud Save extension. You can see the cloud services currently supported in the screenshot above, and more are being added all the time. So if your favorite is not listed yet, just keep checking in at the extension’s homepage. Cloud Save [Google Chrome Extensions]

    Read the article

  • String patterns that can be used to filter and group files

    - by Louis Rhys
    One of our applications filters files in a certain directory, extracts some data from them, and exports a document from the extracted data. The algorithm for extracting the data depends on the file, and so far we use a regex to select the algorithm to be used; for example, .*\.txt will be processed by algorithm A, foo[0-5]\.xml by algorithm B, etc. However, now we need some files to be processed together. For example, in one case we need two files, foo.*\.xml and bar.*\.xml. Part of the information to be extracted exists in the foo file and the other part in the bar file. Moreover, we need to make sure the wildcards are compatible. For example, given these 6 files: foo1.xml foo23.xml bar1.xml bar9.xml bar23.xml foo4.xml I would expect foo1 and bar1 to be identified as a group, and foo23 and bar23 as another group. bar9 and foo4 have no pair, so they will not be processed. Now, since the filter is configured by the user, we need a pattern that can express the above requirement. I don't think you can express meaning like this in standard regex: (foo|bar).*\.xml will match all 6 files above, but we can't identify which file is paired with a particular file. Is there any standard pattern that can express it? Or any idea how to modify regex to support this that can be implemented easily?
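
    One approach that stays close to regex (not standard regex syntax on its own, but easy to implement): let the user supply one pattern per file role, all sharing a named capture group that acts as the grouping key, then group the matches by that key and keep only complete groups. A minimal C# sketch using the example files above (the role names and patterns are illustrative, not a finished design):

        using System;
        using System.Linq;
        using System.Text.RegularExpressions;

        class PairFiles
        {
            static void Main()
            {
                // User-configured patterns: one per role, each with a "key" group
                // that identifies which files belong together.
                var roles = new (string Role, Regex Pattern)[]
                {
                    ("foo", new Regex(@"^foo(?<key>\d+)\.xml$")),
                    ("bar", new Regex(@"^bar(?<key>\d+)\.xml$")),
                };
                var files = new[] { "foo1.xml", "foo23.xml", "bar1.xml",
                                    "bar9.xml", "bar23.xml", "foo4.xml" };

                var groups = files
                    .SelectMany(f => roles, (f, r) => (File: f, Role: r.Role, Match: r.Pattern.Match(f)))
                    .Where(x => x.Match.Success)
                    .GroupBy(x => x.Match.Groups["key"].Value)
                    .Where(g => g.Select(x => x.Role).Distinct().Count() == roles.Length);

                // Prints groups 1 and 23; the unpaired bar9 and foo4 are dropped.
                foreach (var g in groups)
                    Console.WriteLine($"{g.Key}: {string.Join(", ", g.Select(x => x.File))}");
            }
        }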

    Read the article

  • Using OData to get Mix10 files

    - by Jon Dalberg
    There has been a lot of talk around OData lately (go to odata.org for more information) and I wanted to get all the videos from Mix ‘10: two great tastes that taste great together. Luckily, Mix has exposed the ‘10 sessions via OData at http://api.visitmix.com/OData.svc, now all I have to do is slap together a bit of code to fetch the videos. Step 1 (cut a hole in the box) Create a new console application and add a new service reference. Step 2 (put your junk in the box) Write a smidgen of code:

        static void Main(string[] args)
        {
            var mix = new Mix.EventEntities(new Uri("http://api.visitmix.com/OData.svc"));

            var files = from f in mix.Files
                        where f.TypeName == "WMV"
                        select f;

            var web = new WebClient();

            var myVideos = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "Mix10");

            Directory.CreateDirectory(myVideos);

            files.ToList().ForEach(f => {
                var fileName = new Uri(f.Url).Segments.Last();
                Console.WriteLine(f.Url);
                web.DownloadFile(f.Url, Path.Combine(myVideos, fileName));
            });
        }

    Step 3 (have her open the box) Compile and run. As you can see, the client reference created for the OData service handles almost everything for me. Yeah, I know there is some batch file to download the files, but it relies on cUrl being on the machine – and I wanted an excuse to work with an OData service. Enjoy!

    Read the article

  • ODI 11g – Faster Files

    - by David Allan
    Deep in the trenches of ODI development I raised my head above the parapet to read a few odds and ends, and then wondered: why don’t they know this? Such as this article here – in the past, customers (see forum) were told to use a staging route, which has a big overhead for large files. This KM is an example of the great extensibility capabilities of ODI. It’s quite simple – just a new KM that improves the out-of-the-box experience (just build the mapping and the appropriate KM is used) and improves out-of-the-box performance for file-to-file data movement. This improvement to out-of-the-box handling of File to File data integration cases (from the 11.1.1.5.2 companion CD onward) dramatically speeds up file integration handling. In the past I had seen some consultants write Perl versions of the file-to-file integration case; now Oracle ships this KM to fill the gap. You can find the documentation for the IKM here. The KM uses pure Java to perform the integration, using java.io classes to read and write the file in a pipe – it uses Java threading to super-charge the file processing, and can process several source files at once when the datastore's resource name contains a wildcard. This is a big step for regular file processing on the way to super-charging big data files using Hadoop – the KM works with the lightweight agent and regular filesystems. So in my design below, transforming a bunch of files, the IKM File to File (Java) knowledge module was assigned by default. I pointed the KM at my JDK (since the KM generates and compiles Java), and I also increased the thread count to 2 to take advantage of my 2 processors. For my illustration I transformed (you can also filter if desired) and moved about 1.3GB with 2 threads in 140 seconds (with a single thread it took 220 seconds) – by no means was this on any supercomputer, by the way. The great thing here is that it worked well out of the box from design to execution without any funky configuration. Plus – and it's a big plus – it was much faster than before. So if you are doing any file-to-file transformations, check it out!
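
    The underlying pattern – a pool of workers, each streaming one source file straight to its target – is easy to sketch outside ODI as well. A minimal illustration in C# (just the concept, not the KM's actual Java implementation; the paths are hypothetical):

        using System.IO;
        using System.Threading.Tasks;

        class ParallelFileCopy
        {
            static void Main()
            {
                // The wildcard mirrors a datastore resource name like FILE*.dat.
                var sources = Directory.GetFiles(@"C:\staging\in", "FILE*.dat");
                var options = new ParallelOptions { MaxDegreeOfParallelism = 2 }; // like the KM's thread count

                Parallel.ForEach(sources, options, src =>
                {
                    var dest = Path.Combine(@"C:\staging\out", Path.GetFileName(src));
                    using (var input = File.OpenRead(src))
                    using (var output = File.Create(dest))
                    {
                        input.CopyTo(output); // buffered stream-to-stream copy, no staging area
                    }
                });
            }
        }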

    Read the article

  • Upgrade to 0.25, files served to uPNP devices cannot play

    - by David Buttrick
    I have a Sony BDP-S390 Blu-ray & network player. I upgraded my Myth server to 0.25. When I browse to the Myth server and try to play a recording, I get an error message about the file not being playable in the player. Interestingly, the files that I have recorded, and the videos that I have loaded into my Video volume group, are .mpg or .mp4. The player shows the filetype that it thinks the file is in its list, and it claims that these files are AVI files; however, none of them are – they are all .mp4 or .mpg files. Thinking that was just an optical illusion, I went ahead and tried to play a file anyway, but I get an error about the file not being playable. First of all, is there something that I need to do to make the uPNP server know about different filetypes? Is it reporting AVI because it hasn't been told about MPG or MP4? Second, I'd like to help out some more here and collect some logging about the uPNP server in the Myth server. I can't seem to find information on how to turn on logging, and there is no mythbackend settings file in /etc/default. Thanks very much.
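
    For the logging part, a hedged pointer (MythTV 0.25 reworked its logging, so these exact option names are an assumption – check mythbackend --help on your build): logging in 0.25 is controlled by command-line switches rather than a file in /etc/default, along the lines of:

        # assumed 0.25-style flags: verbose uPNP component logging at debug level
        mythbackend -v upnp --loglevel debug --logpath /var/log/mythtv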

    Read the article

  • All in a Day's Work: Unblocking Multiple Downloaded Files with a Single Command

    - by Sam Abraham
    Files downloaded using Internet Explorer retain the Internet Zone permission level and hence are “Blocked” by default on Windows 7 machines. Honestly, while it is added overhead for developers, I really appreciate this feature as it provides a good protection layer for casual web users. My usual workaround is to simply unblock the downloaded zip file (if the download was a zip file), which, in turn, unblocks the files stored within. Today, however, I was left in a situation where I had to “Open” and “Copy” the content rather than “Save” a zip file. That of course left me with a few dozen files I had to manually unblock. A few minutes of internet search led me to the link below, which worked like a charm:

        1. Download streams.exe from Sysinternals - http://technet.microsoft.com/en-us/sysinternals/bb897440.aspx
        2. Go to the command prompt (cmd.exe)
        3. Navigate to where you have streams.exe installed
        4. Use the command-line switches: streams.exe -s -d "<folder path>"

    This removed the Internet Zone restrictions from all files under “<folder path>” and its subfolders as well. [Deleted :Zone.Identifier:$DATA] References: http://social.technet.microsoft.com/Forums/en-US/itproxpsp/thread/806f0104-1caa-4a66-b504-7a681d1ccb33/
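
    As an aside, if PowerShell 3.0 or later is available, the built-in Unblock-File cmdlet does the same job without a separate download ("<folder path>" is a placeholder, as above):

        # remove the Zone.Identifier stream from every file under the folder
        Get-ChildItem -Path "<folder path>" -Recurse -File | Unblock-File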

    Read the article

  • Generating HTML Help files based on XML documentation

    - by geekrutherford
    Since discovering the XML commenting features built into .NET years ago, I have been using them to help make my code more readable and simpler for other developers to understand exactly what the code is doing. Entering /// preceding a line of code causes Visual Studio to insert "summary" tags.  It also results in additional tags being generated if you are commenting a method with parameters and a return type. I already knew that IntelliSense would pick up these comments and display them when coding and selecting properties, methods, etc. from a class.  I also knew that you could set Visual Studio to generate an XML file containing said comments.  Only recently did I begin to wonder if I could generate some kind of readable help files based on these comments I so diligently added. After searching the web I came across NDoc, an open-source project which creates documentation for you based on the XML files generated by Visual Studio.  Unfortunately, NDoc has become stale and is no longer supported (the last release was back in 2005). Fortunately there is a little-known tool from Microsoft themselves called "Sandcastle Help File Builder".  This nifty little tool gives you a graphical interface that allows you to specify multiple DLL and XML files from which to generate an MSDN-like HTML Help File for your own projects! You can check it out here: http://shfb.codeplex.com/ If you are curious how to set Visual Studio to generate the above-referenced XML documentation files, simply go to your project's property page and edit as shown below (my paths are specific; you can leave yours at the default values):
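
    For reference, here is roughly what the /// comments look like in code – type /// above a member and Visual Studio stubs out the tags for you (the method itself is a made-up example):

        /// <summary>
        /// Calculates the total price of an order, including sales tax.
        /// </summary>
        /// <param name="subtotal">The pre-tax order amount.</param>
        /// <param name="taxRate">The tax rate as a fraction, e.g. 0.07 for 7%.</param>
        /// <returns>The order total with tax applied.</returns>
        public decimal CalculateTotal(decimal subtotal, decimal taxRate)
        {
            return subtotal * (1 + taxRate);
        }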

    Read the article

  • Not able to write log file to specified path (Microsoft Enterprise Library Logging)

    - by prince23
    Hi, I am trying to implement logging configuration using Microsoft Enterprise Library Logging:

        class Program
        {
            static void Main(string[] args)
            {
                LogEntry entry = new LogEntry() { Message = "Hello Ent. Lib. Logging" };
                Logger.Write(entry);
            }
        }

    Here the entry contains the details, but these details are not written to the file. In the config file I have the settings done like this:

        <?xml version="1.0" encoding="utf-8"?>
        <configuration>
          <configSections>
            <section name="loggingConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.LoggingSettings, Microsoft.Practices.EnterpriseLibrary.Logging, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
            <section name="exceptionHandling" type="Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.Configuration.ExceptionHandlingSettings, Microsoft.Practices.EnterpriseLibrary.ExceptionHandling, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
            <section name="dataConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Data.Configuration.DatabaseSettings, Microsoft.Practices.EnterpriseLibrary.Data, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
          </configSections>
          <loggingConfiguration name="Logging Application Block" tracingEnabled="true" defaultCategory="General" logWarningsWhenNoCategoriesMatch="true">
            <listeners>
              <add fileName="C:\Inetpub\wwwroot\trace.log" header="----------------------------------------" footer="----------------------------------------" formatter="" listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.FlatFileTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" traceOutputOptions="None" filter="All" type="Microsoft.Practices.EnterpriseLibrary.Logging.TraceListeners.FlatFileTraceListener, Microsoft.Practices.EnterpriseLibrary.Logging, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="FlatFile TraceListener" />
              <add source="Enterprise Library Logging" formatter="Text Formatter" log="Application" machineName="" listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.FormattedEventLogTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" traceOutputOptions="None" filter="All" type="Microsoft.Practices.EnterpriseLibrary.Logging.TraceListeners.FormattedEventLogTraceListener, Microsoft.Practices.EnterpriseLibrary.Logging, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="Formatted EventLog TraceListener" />
            </listeners>
            <formatters>
              <add template="Timestamp: {timestamp}&#xD;&#xA;Message: {message}&#xD;&#xA;Category: {category}&#xD;&#xA;Priority: {priority}&#xD;&#xA;EventId: {eventid}&#xD;&#xA;Severity: {severity}&#xD;&#xA;Title:{title}&#xD;&#xA;Machine: {machine}&#xD;&#xA;Application Domain: {appDomain}&#xD;&#xA;Process Id: {processId}&#xD;&#xA;Process Name: {processName}&#xD;&#xA;Win32 Thread Id: {win32ThreadId}&#xD;&#xA;Thread Name: {threadName}&#xD;&#xA;Extended Properties: {dictionary({key} - {value}&#xD;&#xA;)}" type="Microsoft.Practices.EnterpriseLibrary.Logging.Formatters.TextFormatter, Microsoft.Practices.EnterpriseLibrary.Logging, Version=4.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="Text Formatter" />
            </formatters>
            <categorySources>
              <add switchValue="All" name="General">
                <listeners>
                  <add name="Formatted EventLog TraceListener" />
                </listeners>
              </add>
            </categorySources>
            <specialSources>
              <allEvents switchValue="All" name="All Events" />
              <notProcessed switchValue="All" name="Unprocessed Category" />
              <errors switchValue="All" name="Logging Errors &amp; Warnings">
                <listeners>
                  <add name="Formatted EventLog TraceListener" />
                </listeners>
              </errors>
            </specialSources>
          </loggingConfiguration>
          <exceptionHandling>
            <exceptionPolicies>
              <add name="Exception Policy">
                <exceptionTypes>
                  <add type="System.Exception, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" postHandlingAction="NotifyRethrow" name="Exception" />
                </exceptionTypes>
              </add>
            </exceptionPolicies>
          </exceptionHandling>
        </configuration>

    I have been trying my best to solve this for the past 2 days. Does anyone know what the issue is here – why is my log file not written to the specified path, and why aren't the log details written to the file at fileName="C:\Inetpub\wwwroot\trace.log"?
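
    One thing stands out in the config above (hedged – there may be other issues too, such as write permissions on C:\Inetpub\wwwroot): the "General" category routes only to the Formatted EventLog TraceListener, so the FlatFile TraceListener that points at trace.log is defined but never referenced by any category, and nothing is ever sent to the file. Adding it to the category's listeners should get entries written to the log file:

        <categorySources>
          <add switchValue="All" name="General">
            <listeners>
              <add name="FlatFile TraceListener" />
              <add name="Formatted EventLog TraceListener" />
            </listeners>
          </add>
        </categorySources>

    Also note that for a console application like the one above, this configuration belongs in the application's App.config (yourapp.exe.config), not in a web.config.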

    Read the article

  • Log - Server kernel: INFO: task httpd:000000 blocked for more than 120 seconds

    - by valter
    Almost every day my server crashes due to high server load, and even restarting Apache or MySQL can't solve the problem. I need to reboot the server to fix it, or it crashes again due to the high load. The system log records something like this when it crashes:

        Aug 11 18:33:53 server kernel: INFO: task httpd:20008 blocked for more than 120 seconds.
        Aug 11 18:33:53 server kernel: "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
        Aug 11 18:33:53 server kernel: httpd D ffffffff801538ac 0 20008 5816 20066 19809 (NOTLB)
        Aug 11 18:33:53 server kernel: ffff81025a299dc8 0000000000000082 ffff81033b4c0740 ffffffff80009a14
        Aug 11 18:33:53 server kernel: ffff8101063f8d80 0000000000000009 ffff8100b758f7e0 ffff8101c57187e0
        Aug 11 18:33:53 server kernel: 00009436d4100b6c 000000000001d50f ffff8100b758f9c8 000000083b531588
        Aug 11 18:33:53 server kernel: Call Trace:
        Aug 11 18:33:53 server kernel: [<ffffffff80009a14>] __link_path_walk+0x173/0xfb9
        Aug 11 18:33:53 server kernel: [<ffffffff8002cc16>] mntput_no_expire+0x19/0x89
        Aug 11 18:33:53 server kernel: [<ffffffff80063c4f>] __mutex_lock_slowpath+0x60/0x9b
        Aug 11 18:33:53 server kernel: [<ffffffff80023908>] __path_lookup_intent_open+0x56/0x97
        Aug 11 18:33:53 server kernel: [<ffffffff80063c99>] .text.lock.mutex+0xf/0x14
        Aug 11 18:33:53 server kernel: [<ffffffff8001b21f>] open_namei+0xea/0x712
        Aug 11 18:33:54 server kernel: [<ffffffff8002768a>] do_filp_open+0x1c/0x38
        Aug 11 18:33:54 server kernel: Firewall: *UDP_IN Blocked* IN=eth1 OUT= MAC=ff:ff:ff:ff:ff:ff:00:30:48:9e:6e:99:08:00 SRC=208.43.135.158 DST=255.255.255.255 LEN=151 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=UDP SPT=38354 DPT=6112 LEN=131
        Aug 11 18:33:54 server kernel: [<ffffffff8001a061>] do_sys_open+0x44/0xbe
        Aug 11 18:33:54 server kernel: [<ffffffff8005d28d>] tracesys+0xd5/0xe0

    I googled a lot trying to find a solution, but it looks like the solution is to update the kernel or disk driver, things that I don't know how to do. At http://bugs.centos.org/view.php?id=4515 a lot of people report similar problems, except that theirs are not related to httpd like mine. According to one member, one solution would be to add "elevator=noop" to /etc/grub.conf, as in this example:

        title CentOS (2.6.18-238.12.1.el5xen)
            root (hd0,0)
            kernel /vmlinuz-2.6.18-238.12.1.el5xen ro root=/dev/VolGroup00/LogVol00 elevator=noop
            initrd /initrd-2.6.18-238.12.1.el5xen.img

    Would this really solve the problem? My disks are working in RAID. Can this cause some problem for my server? Is there any other solution?

    Read the article

  • Unable to locate Windows Server Error log files

    - by Sam007
    I am getting a 500 Internal Server Error in my application. Firebug gives me: NetworkError: 500 Internal Server Error - http://webgis.arizona.edu/ArcGIS/rest/services/webGIS/Shock_Models/GPServer/Income_Log/jobs/jc09c501156564f71abc5d98393581267/results/final_shp?dpi=96&transparent=true&format=png8&imageSR=102100&f=image&bbox=%7B%22xmin%22%3A-14519891.438356264%2C%22ymin%22%3A637618.0139790997%2C%22xmax%22%3A-6692739.741956295%2C%22ymax%22%3A6507981.786279075%2C%22spatialReference%22%3A%7B%22wkid%22%3A102100%7D%7D&bboxSR=102100&size=800%2C600 And when I go to that particular link, I get this error: Server Error - Object reference not set to an instance of an object. Any idea how I can correct it? UPDATE I was told to use Fiddler to see the error details occurring over the network, and this is the output that I got:

        SESSION STATE: Done.
        Response Entity Size: 849 bytes.
        == FLAGS ==================
        BitFlags: [ClientPipeReused, ServerPipeReused] 0x18
        X-CLIENTPORT: 2010
        X-RESPONSEBODYTRANSFERLENGTH: 849
        X-EGRESSPORT: 2023
        X-HOSTIP: 128.196.53.161
        X-PROCESSINFO: firefox:2248
        X-CLIENTIP: 127.0.0.1
        X-SERVERSOCKET: REUSE ServerPipe#2
        == TIMING INFO ============
        ClientConnected: 15:53:51.383
        ClientBeginRequest: 15:53:51.494
        GotRequestHeaders: 15:53:51.494
        ClientDoneRequest: 15:53:51.494
        Determine Gateway: 0ms
        DNS Lookup: 0ms
        TCP/IP Connect: 0ms
        HTTPS Handshake: 0ms
        ServerConnected: 15:52:45.077
        FiddlerBeginRequest: 15:53:51.495
        ServerGotRequest: 15:53:51.495
        ServerBeginResponse: 15:53:51.679
        GotResponseHeaders: 15:53:51.679
        ServerDoneResponse: 15:53:51.679
        ClientBeginResponse: 15:53:51.679
        ClientDoneResponse: 15:53:51.679
        Overall Elapsed: 00:00:00.1850106
        The response was buffered before delivery to the client.
        == WININET CACHE INFO ============
        This URL is not present in the WinINET cache. [Code: 2]
        * Note: Data above shows WinINET's current cache state, not the state at the time of the request.
        * Note: Data above shows WinINET's Medium Integrity (non-Protected Mode) cache only.

    But I am still confused as to what the error is. This is the application. I am not sure if the error is due to the ArcGIS Server or to Windows Server 2008. I am new to working with Windows Server and wanted to know where I can look for the error log files. This is the link which gives the details and the log info of the job executed. This is the output.

    Read the article

  • Inspection of Insert Statement When Using LINQ's SubmitChanges

    - by Code Sherpa
    Hi. I want to see what my insert statement would look like, as if I were wiring up a text-based ADO.NET command. How do I do this? I have been following the link below: http://damieng.com/blog/2008/07/30/linq-to-sql-log-to-debug-window-file-memory-or-multiple-writers And I have added the DebugTextWriter class to my project. So, now, in my code I have the following, which doesn't really do anything, and I don't think it's right:

        using(WorkbookDataContext dc = _conn.GetContext())
        {
            if(profile.ProfileId > 0)
            {
                dc.Profiles.Attach(profile, true);
            }
            else
            {
                dc.Profiles.InsertOnSubmit(profile);
            }

            dc.Log = new DebugTextWriter();
        #if DEBUG
            dc.Log = new System.IO.StreamWriter("linq-to-sql.log") { AutoFlush = true };
        #endif

            dc.SubmitChanges();
        }

    Any ideas what I am doing wrong and/or how to inspect my LINQ insert statement correctly? Thanks
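
    Two observations, hedged since WorkbookDataContext and _conn are specific to this project: in the snippet above, the DEBUG build immediately overwrites the DebugTextWriter with the StreamWriter, so only one of the two ever sees the SQL; and DataContext.Log must be assigned before SubmitChanges (it is here), because that is when LINQ to SQL writes out the statements. The quickest sanity check is the built-in console writer:

        using (WorkbookDataContext dc = _conn.GetContext())
        {
            dc.Log = Console.Out;   // dumps the generated SQL, including the INSERT, to the console
            dc.Profiles.InsertOnSubmit(profile);
            dc.SubmitChanges();     // the statement text appears when this executes
        }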

    Read the article

  • Compress Large Video Files with DivX / Xvid and AutoGK

    - by DigitalGeekery
    Have you ever recorded home video on a camcorder only to find the video size is enormous? What if you wanted to share a video clip on YouTube or another video sharing site, but the file size was bigger than the maximum upload size? Today we’ll look at a way to compress certain video files, such as MPEG and AVI, with Auto Gordian Knot (AutoGK). AutoGK is a free application that runs on Windows. It supports MPEG-1, MPEG-2, transport streams, VOBs, and virtually any codec used for an .AVI file. AutoGK will accept as input the following file types: MPG, MPEG, VOB, VRO, M2V, DAT, IFO, TS, TP, TRP, M2T, and AVI. Files are output as .AVI files and are converted using the DivX or Xvid codecs. Installing and Using AutoGK Download and install AutoGK (link below). Open AutoGK. You’ll need to navigate a few wizard screens, but you can just accept the defaults. Choose your video file by clicking on the folder to the right of the Input file text box. Browse for and select your video file and click “Open.” For this example, we’ll be working with an .AVI file that’s 167MB in size. The output file is copied into the same directory as the input file by default, but you can change this if you choose. If the input file is also .AVI, AutoGK will append an _agk to the output file so that the original is not overwritten. Next, you’ll see any audio tracks listed. You can unselect the check box if you’d like to remove the audio track. You can choose one of the predefined size options, or select a custom size in MB or a target quality in percentage. For our example, we’ll be compressing our 167MB file to 35MB. Click on Advanced Settings. Here you can choose your codec, if you have a preference, as well as output resolution and output audio. If you’d like to use the DivX codec, you’ll need to download and install it separately (see link below). Typically you’ll want to keep the defaults. Click “OK.” Now you’re ready to add your file conversion job to the job queue. Click Add Job to add it to the queue. You can add multiple file conversions to the job queue and convert them in one batch. Click Start to begin the conversion process. You’ll be able to see the progress in the log window at the bottom left. When the conversion is complete you’ll see “Job finished” and the total time in the log window. Check your output file to see its compressed size. Test your video just to make sure the output quality is satisfactory. Note: Conversion times can vary greatly depending on the size of the file and your computer hardware. Files that are several GBs in size may take several hours to compress. AutoGK is no longer being actively developed but is still a wonderful DivX/Xvid conversion tool. It can also be used to compress and convert non-copy-protected DVDs.
    Downloads: AutoGordianKnot; DivX (optional)

    Read the article

  • help! corrupt file recovery

    - by TheBumpper
    My supervisor's computer crashed last night, and I'm trying to help him out. He made an R script, but when he tried to open it, it was empty. Yet for some reason the file is 7.9 kB, so I think it should not be empty... Anyway, when I tried to open it, gedit gave this error: "The file you opened has some invalid characters. If you continue editing this file you could corrupt this document. You can also choose another character encoding and try again." along with options to re-encode the characters. The content looked like this (with a red background), repeated for the whole file: \00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00 My question is, is there a way to restore the file? I hope someone has a brilliant idea.
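
    A hedged suggestion rather than a definitive fix: a file full of \00 (NUL) bytes usually means the contents themselves were destroyed in the crash, so nothing can be recovered from that file. But since it was an R script, the commands may survive elsewhere – R writes the command history of interactive sessions to a .Rhistory file, and many editors keep backup or autosave copies next to the original (the script directory below is a placeholder):

        ls -la ~/.Rhistory             # R's saved command history, if he worked interactively
        ls -la /path/to/script/dir/    # look for backups such as script.R~ or autosave files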

    Read the article

  • Opening a file from the browser opens its directory instead

    - by user67662
    When I download a file through any browser (Chrome, Firefox, etc.) and try to open it directly, instead of the file opening, the directory it was downloaded to opens. The same thing happened when I tried to open a file from the GNOME Shell dash. This only happens with direct links/shortcuts to the files; when I am inside Nautilus the file opens without problems. I have tried different desktop environments; the one I use most is GNOME Shell, under Ubuntu 12.04. How can I fix this? Thanks!

    Read the article

  • SQL SERVER – A Quick Look at Logging and Ideas around Logging

    - by pinaldave
    This blog post is written in response to the T-SQL Tuesday post on Logging. When someone talks about logging, I personally get lots of ideas about it. I have seen logging used as a very generic term, so let me ask you this question before I continue writing about logging: what is the first thing that comes to your mind when you hear the word “Logging”? Now ask the same question of the person standing next to you. I am pretty confident that you will get a different answer from different people. I decided to try this activity and asked five SQL Server people the same question. Question: What is the first thing that comes to your mind when you hear the word “Logging”? Strangely enough, I got a different answer every single time. Let me just list the answers I got from my friends, and let us go over them one by one.

    Output Clause

    The very first person replied "output clause". A pretty interesting answer to start with, and I can see exactly what he was thinking. SQL Server 2005 introduced the new OUTPUT clause. The OUTPUT clause has access to the inserted and deleted (virtual) tables, just like triggers, and can be used to return values to the client. It can be used with INSERT, UPDATE, or DELETE to identify the actual rows affected by these statements. Here are some references for the Output Clause: OUTPUT Clause Example and Explanation with INSERT, UPDATE, DELETE; Reasons for Using Output Clause – Quiz; Tips from the SQL Joes 2 Pros Development Series – Output Clause in Simple Examples.

    Error Logs

    I was expecting someone to mention error logs when the subject is logging. The error log is the first place to look when there is any error, whether with the application or with the operating system. I keep a policy of checking my server's error log every day. The reason is simple: enough times in my career I have figured out that when I am looking at error logs I find something I was not expecting. There are cases when I noticed errors in the error log and fixed them before the end user noticed. Another common practice I always suggest to my DBA friends: when any error happens, they should find the relevant entries in the error logs and document them. It is quite possible that they will see the same error in the error log again and be able to fix it based on the knowledge base they have created. There are many different kinds of error logs in SQL Server as well – 1) SQL Server Error Logs 2) Windows Event Log 3) SQL Server Agent Log 4) SQL Server Profiler Log 5) SQL Server Setup Log, etc. Here are some references for Error Logs: Recycle Error Log – Create New Log file without Server Restart; SQL Error Messages.

    Change Data Capture

    This answer surprised me – more than the answer itself, I was surprised by the person who gave it. I always thought he was an expert in HTML and JavaScript, but I guess one should never assume. Indeed, one of the cool logging features is Change Data Capture. Change Data Capture records INSERTs, UPDATEs, and DELETEs applied to SQL Server tables, and makes a record available of what changed, where, and when, in simple relational ‘change tables’ rather than in an esoteric chopped salad of XML. These change tables contain columns that reflect the column structure of the source table you have chosen to track, along with the metadata needed to understand the changes that have been made. Here are some references for Change Data Capture: Introduction to Change Data Capture (CDC) in SQL Server 2008; Tuning the Performance of Change Data Capture in SQL Server 2008; Download Script of Change Data Capture (CDC); CDC and TRUNCATE – Cannot truncate table because it is published for replication or enabled for Change Data Capture.

    Dynamic Management View (DMV)

    I like this answer. If asked, I would not have come up with DMVs right away, but in the spirit of the original question, I think DMVs do log data. A DMV logs, stores, or records various data and activity on the SQL Server. Dynamic management views return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. One can get a plethora of information from DMVs – high availability status, query execution details, SQL Server resource status, etc. Here are some references for Dynamic Management Views (DMV): SQL SERVER – Denali – DMV Enhancement – sys.dm_exec_query_stats – New Columns; DMV – sys.dm_os_windows_info – Information about Operating System; DMV – sys.dm_os_wait_stats Explanation – Wait Type – Day 3 of 28; DMV sys.dm_exec_describe_first_result_set_for_object – Describes the First Result Metadata for the Module; Transaction Log Impact Detection Using DMV – dm_tran_database_transactions.

    Log Files

    I almost flipped at this final answer from my friend. This should probably have been the first answer. Yes, indeed, the log file logs SQL Server activities, and one can write infinite things about log files. SQL Server uses a log file with the extension .ldf to manage transactions and maintain database integrity. The log file ensures that valid data is written out to the database and that the system is in a consistent state. Log files are extremely useful in case of database failures, as with the help of a full backup the database can be brought to the desired state (point-in-time recovery is also possible). A SQL Server database has three recovery models – 1) Simple, 2) Full, and 3) Bulk Logged – and each of the models uses the .ldf file for performing various activities. It is very important to take backups of the log files (along with full backups), as one never knows when a log file backup will come into action and save the day! How to Stop Growing Log File Too Big; Reduce the Virtual Log Files (VLFs) from LDF file; Log File Growing for Model Database – model Database Log File Grew Too Big; master Database Log File Grew Too Big; SHRINKFILE and TRUNCATE Log File in SQL Server 2008.

    Can I just say I loved this month's T-SQL Tuesday question – it really provoked very interesting conversation around me. Reference: Pinal Dave (http://blog.SQLAuthority.com)
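
    Since the OUTPUT clause is the least self-explanatory of the five answers, here is a minimal T-SQL illustration (the table and values are made up for the example):

        -- capture before/after values of an UPDATE without a separate SELECT
        UPDATE dbo.Orders
        SET    Status = 'Shipped'
        OUTPUT deleted.Status  AS OldStatus,
               inserted.Status AS NewStatus,
               inserted.OrderID
        WHERE  OrderID = 42;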

    Read the article

  • How to Fix "Read-only file system" error when I run something as sudo and try to make a folder/file?

    - by Andrew
    When I try to save something or rename a file/folder, I get the error "Read-only file system", and when I run something as root in the terminal I get errors like these:

        sudo: unable to open /var/lib/sudo/"My User Name"/0: Read-only file system
        W: Not using locking for read only lock file /var/lib/dpkg/lock
        E: Unable to write to /var/cache/apt/
        E: The package lists or status file could not be parsed or opened.

    When I make a folder, the error dialog details in Nautilus say: Error creating directory: Read-only file system. I would show you a picture of it, but it isn't even letting me save onto my flash drive. Please help me.
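
    A hedged pointer rather than a guaranteed fix: Linux remounts a filesystem read-only when it detects disk errors, so the usual recovery is to check the kernel log for the reason and then run fsck on the affected partition from recovery mode or a live CD (/dev/sda1 below is only an example – substitute your own root partition):

        dmesg | grep -iE 'error|read-only|remount'   # why the filesystem went read-only
        sudo fsck -f /dev/sda1                       # run from a live CD with the partition unmounted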

    Read the article

  • Can't find Localhost files

    - by GMF
    Hope you can help. This is my first time trying Ubuntu/Linux. I am logged in as root. I have downloaded and installed LAMP and phpMyAdmin, and I get the test page under localhost saying that it works and is installed correctly. I have also put my files in /var/www; they are PHP files. When I enter the address localhost/(page name.php) I get an error saying: Not Found The requested URL /index.php was not found on this server. Apache/2.2.22 (Ubuntu) Server at localhost Port 80 Have I put the files in the wrong folder? If I look in /etc/apache2/sites-available/default, it tells me my DocumentRoot is /var/www. Would love some help on this please. Many thanks GF
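
    Two quick checks worth trying, hedged since the exact filenames aren't shown: Apache is looking for /var/www/index.php under that exact, case-sensitive name, and the files must be readable by the web server:

        ls -l /var/www/                  # is index.php really in this folder, spelled exactly?
        sudo chmod 644 /var/www/*.php    # ensure Apache can read the files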

    Read the article

  • I am receiving a message saying I have duplicate sources but I can't seem to find a duplicate of the line described, any ideas?

    - by David Griffiths
    I receive this message when I run sudo apt-get update in the terminal:

        Duplicate sources.list entry http://archive.canonical.com/ubuntu/ precise/partner i386 Packages (/var/lib/apt/lists/archive.canonical.com_ubuntu_dists_precise_partner_binary-i386_Packages)

    So I ran the command gksu gedit /etc/apt/sources.list and checked the file, but I can't find a duplicate – not that I can see, anyway. Here is the file:

        # deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release i386 (20120423)]/ precise main restricted
        deb-src http://archive.ubuntu.com/ubuntu precise main restricted #Added by software-properties
        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise main restricted
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise restricted main multiverse universe #Added by software-properties
        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-updates main restricted
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise-updates restricted main multiverse universe #Added by software-properties
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise universe
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-updates universe
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise multiverse
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-updates multiverse
        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse #Added by software-properties
        deb http://security.ubuntu.com/ubuntu precise-security main restricted
        deb-src http://security.ubuntu.com/ubuntu precise-security restricted main multiverse universe #Added by software-properties
        deb http://security.ubuntu.com/ubuntu precise-security universe
        deb http://security.ubuntu.com/ubuntu precise-security multiverse
        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository.
        ## This software is not part of Ubuntu, but is offered by Canonical and the
        ## respective vendors as a service to Ubuntu users.
        deb http://archive.canonical.com/ubuntu precise partner
        # deb-src http://archive.canonical.com/ubuntu precise partner
        ## Uncomment the following two lines to add software from Ubuntu's
        ## 'extras' repository.
        ## This software is not part of Ubuntu, but is offered by third-party
        ## developers who want to ship their latest software.
        # deb http://extras.ubuntu.com/ubuntu precise main
        # deb-src http://extras.ubuntu.com/ubuntu precise main
        deb http://repository.spotify.com stable non-free

    I can see there are two lines for deb http://archive.canonical.com/ubuntu precise partner, but one has # deb-src at the beginning of it – hashed out, no? I'm quite new to Linux and have little to no experience editing sources, so any help would be most appreciated. Thank you :)
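
    The duplicate usually is not in sources.list itself: apt also reads every file under /etc/apt/sources.list.d/, and Software Sources often adds the Canonical partner repository there as well. A quick way to find the second copy (then comment out or delete one of them):

        grep -rn "archive.canonical.com" /etc/apt/sources.list /etc/apt/sources.list.d/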

    Read the article

  • How to make bash quit tab autocompleting hidden directories

    - by Kristopher Micinski
    Most of the time, I don't need autocompletion for my hidden directories. In fact, that's the point of them being hidden! However, annoyingly, bash takes these directories into account when considering tab autocompletion. This is particularly annoying in the following scenario: a .svn folder alongside a single folder that I want to traverse into by simply pushing tab. (This typically comes up with deep Java packages...) Is there any way to change the default behavior? In the worst case I have to type '.' before tab, which seems like it should be unnecessary.
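
    Two settings worth trying (hedged – behavior varies a little between bash/readline versions): readline's match-hidden-files variable stops completion from offering dotfiles unless you type the leading dot yourself, and bash's FIGNORE excludes entries by suffix, which conveniently catches directories literally named .svn:

        # in ~/.inputrc: only complete dotfiles when the leading '.' is typed
        set match-hidden-files off

        # or in ~/.bashrc: never complete names ending in .svn
        export FIGNORE=.svn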

    Read the article

  • How to convert unconvertable & unviewable .ts files?

    - by Evelin Versh
    How can I convert .ts files that cannot be converted with the usual WinFF, Avidemux, etc. programs? The .ts files in question were recorded from TV with an STV digital cable digibox, and so far they are viewable to me ONLY with that same digibox. All the video-playing programs I tried do not open the files at all (e.g. the classic VLC and Windows Media Player). All but one of the video converters I tried are also not able even to open or load the file into the program, so no conversion is possible. According to WinFF it cannot find codec parameters during the conversion, evidently leading to nothing happening?! HELP!, please.
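
    A hedged diagnosis: when a recording plays only on the set-top box that made it, the box has quite possibly encrypted the recording, and in that case no converter will be able to open it. Before giving up, though, it is worth checking whether any tool can at least see the streams, and trying a straight remux with ffmpeg:

        ffprobe recording.ts                     # can the streams be identified at all?
        ffmpeg -i recording.ts -c copy out.mp4   # remux without re-encoding, if they can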

    Read the article

  • Moving screenshots from 1 folder to another instantly

    - by Frank
    I am hosting a gaming site, and once in a while the server automatically creates a screenshot in the server's folder. Unfortunately, it is NOT configurable in the server settings where it puts these screenshots; it just always dumps the PNG file in the server config folder. My folder structure is, for example, as follows: /home/Game/Server1/ Now, what I would like to achieve is that once the server creates a screenshot in one of these server folders (I have multiple), the operating system moves the screenshot IMMEDIATELY (it is ALWAYS a *.png file) to the webserver folder, for example: /var/www/Server1/filename.png so that players can see the screenshot on the website. Anyone any idea of the smartest way to tackle this problem? Please note that my ideal situation would be if the PNG file is moved immediately after creation. Thanks for your help. Frank
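
    inotify is built for exactly this: the kernel signals the moment a file is finished being written, so there is no polling delay. A minimal sketch using inotifywait from the inotify-tools package (paths match the example above; run one such loop per server folder, e.g. started from a boot script):

        #!/bin/bash
        # move each newly written .png from the game folder to the web folder
        inotifywait -m -e close_write --format '%f' /home/Game/Server1 |
        while read -r file; do
            [[ $file == *.png ]] && mv "/home/Game/Server1/$file" "/var/www/Server1/$file"
        done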

    Read the article

  • How to mount nexus 7 file system to view files from command line

    - by knotech
    I just got a Nexus 7, switched MTP on and plugged it in. It pops right up in Nautilus, is titled Nexus 7, and lets me browse all of the files. However, on the command line it's simply listed as /media/usbdrive, and when I try to list the files, nothing comes up. Do I have to manually mount it in order to navigate the files from the command line? And why is it that Nautilus recognizes it and has access immediately, but I can't get at it from the command line?
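
    What is happening, hedged since the details changed between Ubuntu releases: MTP is not a block device, so there is no ordinary mount – Nautilus talks to the tablet through gvfs, which is why it works there but not in a plain shell. One workaround is to create a real FUSE mount with mtpfs:

        sudo apt-get install mtpfs
        mkdir -p ~/nexus7
        mtpfs ~/nexus7           # now cd/ls work like any directory
        fusermount -u ~/nexus7   # unmount when done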

    Read the article

  • cifs mounted NAS drive treats subdirectories as files

    - by Mike Krejci
    I have a NAS drive mounted in my fstab as follows: //192.168.0.182/Videos /mnt/NAS_02 cifs iocharset=utf8,uid=65534,gid=65534,guest,rw 0 0 It mounts fine, and in the terminal, using cd and ls, you can view all the files and navigate all the directories. However, in any other program, all subdirectories on the mounted NAS drive are treated as files and I can't open them. With a tree like this, for example: /Movies /Comedy /Drama Movie.mp4 I can enter the directory /Movies, but the directories /Movies/Comedy and /Movies/Drama show up as files and I can't enter them. The permissions are all fine. I just upgraded Ubuntu; I was using smbfs on the same system before and it worked fine. It's under cifs that it no longer works. Any thoughts? I can't seem to find any solutions.
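
    One low-confidence suggestion, since this symptom has been traced to inode-number handling with some cifs/server combinations: try adding the noserverino mount option (it makes the client generate its own inode numbers instead of trusting the server's) and remount:

        //192.168.0.182/Videos /mnt/NAS_02 cifs iocharset=utf8,uid=65534,gid=65534,guest,rw,noserverino 0 0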

    Read the article

  • How to hide (in Thunar and Nautilus) a directory without putting a dot in its name?

    - by Ivan
    Usually Linux programs store a user's settings in ~/.* directories. But unfortunately some developers (of some applications I need) do not follow this rule and don't start their settings folders' names with a dot. This results in never-user-used folders cluttering the home directory (clutter is perhaps not quite the right word, as there are not many, but they annoy me anyway). Renaming them is not an option, as the applications won't find them in that case (and will create them again). Is there a way to hide a folder whose name has no leading dot from being displayed in common file system browsers? (I actually use Thunar from Xfce, alongside Midnight Commander and Krusader, but wouldn't mind knowing about Nautilus too.)
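
    For the GTK file managers there is a built-in mechanism for exactly this: both Nautilus and Thunar honor a .hidden file in a directory, containing one entry name per line to hide – no renaming required (the folder name below is an example; refresh the file manager view afterwards):

        # hide ~/SomeAppSettings from Nautilus/Thunar without renaming it
        echo "SomeAppSettings" >> ~/.hidden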

    Read the article
