Search Results

Search found 81725 results on 3269 pages for 'input file'.


  • Upgrading MySQL causes errors

    - by user1032531
    I am running Centos 5.8 (Linux 2.6.18-308.13.1.el5 on x86_64), and wish to upgrade MySQL version 5.0.95. I am using a tutorial found at http://www.if-not-true-then-false.com/2010/install-mysql-on-fedora-centos-red-hat-rhel/. When I get to the following, I get the host of errors. Seems to have something to do with going from 386 to 86x64. Can anyone help? Thanks yum --enablerepo=remi,remi-test update mysql mysql-server Transaction Check Error: file /etc/my.cnf from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/info/mysql.info.gz from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysql.1.gz from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqlaccess.1.gz from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqladmin.1.gz from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqldump.1.gz from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqlshow.1.gz from install of mysql50-5.0.96-2.ius.el5.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /etc/my.cnf from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/charsets/Index.xml from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/charsets/cp1250.xml from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/charsets/cp1251.xml from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/czech/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/danish/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/dutch/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/english/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/estonian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/french/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/german/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/greek/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/hungarian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/italian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file 
from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/japanese/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/korean/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/norwegian-ny/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/norwegian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/polish/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/portuguese/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/romanian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/russian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/serbian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/slovak/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/spanish/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/swedish/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/mysql/ukrainian/errmsg.sys from install of mysql-libs-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/bin/mysqlaccess from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/my_print_defaults.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysql.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysql_config.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysql_find_rows.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysql_waitpid.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqlaccess.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqladmin.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqldump.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386 file /usr/share/man/man1/mysqlshow.1.gz from install of mysql-5.5.28-1.el5.remi.x86_64 conflicts with file from package mysql-5.0.95-1.el5_7.1.i386

    Read the article

  • System.IO.IOException: file used by another process

    - by Srodriguez
    Dear all, I've been working on this small piece of code that seems trivial, but I still cannot really see where the problem is. My function does a pretty simple thing: it opens a file, copies its contents, replaces a string inside and copies it back to the original file (a simple search and replace inside a text file, then). I didn't really know how to do that as I'm adding lines to the original file, so I just create a copy of the file (file.tmp), also copy a backup (file.old), then delete the original file and copy file.tmp back to the original. I get an exception while doing the delete of the file. Here is the sample code:

        private static bool modifyFile(FileInfo file, string extractedMethod, string modifiedMethod)
        {
            Boolean result = false;
            FileStream fs = new FileStream(file.FullName + ".tmp", FileMode.Create, FileAccess.Write);
            StreamWriter sw = new StreamWriter(fs);
            StreamReader streamreader = file.OpenText();
            String originalPath = file.FullName;
            string input = streamreader.ReadToEnd();
            Console.WriteLine("input : {0}", input);
            String tempString = input.Replace(extractedMethod, modifiedMethod);
            Console.WriteLine("replaced String {0}", tempString);
            try
            {
                sw.Write(tempString);
                sw.Flush();
                sw.Close();
                sw.Dispose();
                fs.Close();
                fs.Dispose();
                streamreader.Close();
                streamreader.Dispose();
                File.Copy(originalPath, originalPath + ".old", true);
                FileInfo newFile = new FileInfo(originalPath + ".tmp");
                File.Delete(originalPath);
                File.Copy(newFile.FullName, originalPath, true);
                result = true;
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex);
            }
            return result;
        }

    And the related exception:

        System.IO.IOException: The process cannot access the file 'E:\mypath\myFile.cs' because it is being used by another process.
           at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
           at System.IO.File.Delete(String path)
           at callingMethod.modifyFile(FileInfo file, String extractedMethod, String modifiedMethod)

    Normally these errors come from unclosed file streams, but I've taken care of that. I guess I've forgotten an important step but cannot figure out where. Thank you very much for your help.

    Read the article

  • Windows Server 2003 R2 Standard: Locks MS Office files, but not Adobe .AI and .PSD files?

    - by Bruce Garlock
    We have some shares set up on a Windows 2003 R2 server, and the MS Office files people save behave properly: the first person to open the file gets read/write, and the second person to open the file while the first person still has it open gets a read-only version. This is not true for graphics files, like Adobe Illustrator .AI files and Photoshop .PSD files. Anyone who goes to open these files has full read/write, even if someone else is already working on the file! This has led to numerous file corruption issues, as well as lost work, since it always saves the last changes to the file. How do we get Windows to properly lock these files so that when someone is working on a file and someone else tries to open it, they get read-only access? Many thanks, Bruce

    Read the article

  • File locked / read-only

    - by oshirowanen
    On a networked computer, I have a file which is coming up as read-only because someone else supposedly has it open. This is not true. The file is stored locally on the computer and it is not being used by anyone else. I can log in to the same computer using a different user and the file opens up fine; I only get the issue with one particular user account. Other than deleting that account/profile and creating it again, how can I unlock this file? Double clicking on the file gives me a message saying "The file is locked for editing by another user, or the file (or the folder in which it is located) is marked as read-only, or you specified that you wanted to open this file read-only." I don't think the folder is locked, because I can use other files in that folder fine; it's just this one file which is causing the issue. I know that only one user is using this file, as the file is on his C: drive and the same file works fine if he logs off to allow another user to log in.

    Read the article

  • Fragmented Log files could be slowing down your database

    - by Fatherjack
    Something that is sometimes forgotten by a lot of DBAs is the fact that database log files get fragmented in the same way that you get fragmentation in a data file. The cause is very different but the effect is the same: too much effort reading and writing data. Data files get fragmented as data is changed through normal system activity; INSERTs, UPDATEs and DELETEs cause fragmentation, and most experienced DBAs are monitoring their indexes for fragmentation and dealing with it accordingly. However, you don't hear about so many working on their log files.

    How can a log file get fragmented? I'm glad you asked. When you create a database there are at least two files created on the disk storage: an mdf for the data and an ldf for the log file (you can also have ndf files for extra data storage, but that's off topic for now). It is wholly possible to have more than one log file, but in most cases there is little point in creating more than one, as the log file is written to in a 'wrap-around' method (more on that later). When a log file is created at the time that a database is created, the file is actually subdivided into a number of virtual log files (VLFs). The number and size of these VLFs depends on the size chosen for the log file. VLFs are also created in the space added to a log file when a log file growth event takes place. Do you have your log files set to auto grow? Then you have potentially been introducing many VLFs into your log file.

    Let's see how many VLFs we have in a brand new database.

        USE master
        GO
        CREATE DATABASE VLF_Test ON
        ( NAME = VLF_Test,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test.mdf',
          SIZE = 100, MAXSIZE = 500, FILEGROWTH = 50 )
        LOG ON
        ( NAME = VLF_Test_Log,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test_log.ldf',
          SIZE = 5MB, MAXSIZE = 250MB, FILEGROWTH = 5MB );
        go
        USE VLF_Test
        go
        DBCC LOGINFO;

    The results of this are, firstly, that a new database is created with the specified file sizes, and then the DBCC LOGINFO results are returned to the script editor. The DBCC LOGINFO results have plenty of interesting information in them, but let's first note that there are 4 rows of information; this relates to the fact that 4 VLFs have been created in the log file. The values in the FileSize column are the sizes of each VLF in bytes; you will see that the last one to be created is slightly larger than the others. So, a 5MB log file has 4 VLFs of roughly 1.25 MB.

    Let's alter the CREATE DATABASE script to create a log file that's a bit bigger and see what happens. Alter the code above so that the log file details are replaced by

        LOG ON
        ( NAME = VLF_Test_Log,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test_log.ldf',
          SIZE = 1GB, MAXSIZE = 25GB, FILEGROWTH = 1GB );

    With a bigger log file specified we get more VLFs. What if we make it bigger again?

        LOG ON
        ( NAME = VLF_Test_Log,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test_log.ldf',
          SIZE = 5GB, MAXSIZE = 250GB, FILEGROWTH = 5GB );

    This time we see more VLFs are created within our log file. We now have our 5GB log file comprised of 16 files of 320MB each. In fact these sizes fall into all the ranges that control the VLF creation criteria – what a coincidence! The rules that are followed when a log file is created or has its size increased are pretty basic:

    - If the file growth is lower than 64MB then 4 VLFs are created
    - If the growth is between 64MB and 1GB then 8 VLFs are created
    - If the growth is greater than 1GB then 16 VLFs are created.

    Now the potential for chaos comes if the default values and settings for log file growth are used. By default a database gets a 1MB log file with unlimited growth in steps of 10%. The database we just created is 6 MB; let's add some data and see what happens.

        USE vlf_test
        go
        -- we need somewhere to put the data so a table is in order
        IF OBJECT_ID('A_Table') IS NOT NULL
        DROP TABLE A_Table
        go
        CREATE TABLE A_Table
        (
        Col_A int IDENTITY,
        Col_B CHAR(8000)
        )
        GO
        -- Let's check the state of the log file
        -- 4 VLFs found
        EXECUTE ('DBCC LOGINFO');
        go
        -- We can go ahead and insert some data and then check the state of the log file again
        INSERT A_Table (col_b)
        SELECT TOP 500 REPLICATE('a',2000)
        FROM sys.columns AS sc, sys.columns AS sc2
        GO
        -- insert 500 rows and we get 22 VLFs
        EXECUTE ('DBCC LOGINFO');
        go
        -- Let's insert more rows
        INSERT A_Table (col_b)
        SELECT TOP 2000 REPLICATE('a',2000)
        FROM sys.columns AS sc, sys.columns AS sc2
        GO 10
        -- insert 2000 rows, in 10 batches and we suddenly have 107 VLFs
        EXECUTE ('DBCC LOGINFO');

    Well, that escalated quickly! Our log file is split, internally, into 107 fragments after a few thousand inserts. The same happens with any logged transactions; I just chose to illustrate this with INSERTs. Having too many VLFs can cause performance degradation at times of database start-up, log backup and log restore operations, so it's well worth keeping a check on this property.

    How do we prevent excessive VLF creation? Creating the database with larger files and larger growth steps, and actively choosing to grow your databases rather than leaving it to the Auto Grow event, can make sure that the growths are made with a size that is optimal.

    How do we resolve a situation of a database with too many VLFs? This process needs to be done when the database is under little or no stress so that you don't affect system users. The steps are:

    - Back up the log: BACKUP LOG YourDBName TO YourBackupDestinationOfChoice
    - Shrink the log file to its smallest possible size: DBCC SHRINKFILE(FileNameOfTLogHere, TRUNCATEONLY) *
    - Re-size the log file to the size you want it to be, taking into account your expected needs for the coming months or year: ALTER DATABASE YourDBName MODIFY FILE ( NAME = FileNameOfTLogHere, SIZE = TheSizeYouWantItToBeIn_MB) *

    * If you don't know the file name of your log file, run sp_helpfile while you are connected to the database that you want to work on and you will get the details you need. The resize step can take quite a while.

    This is already detailed far better than I can explain it by Kimberly Tripp in her blog post 8-Steps-to-better-Transaction-Log-throughput.aspx. The result of this will be a log file with a VLF count according to the bullet list above.

    Knowing when VLFs are being created

    By complete coincidence, while I have been writing this blog (it's been quite some time from its inception to going live), Jonathan Kehayias from SQLSkills.com has written a great article on how to track database file growth using Event Notifications and Service Broker. I strongly recommend taking a look at it, as this is going to catch any sneaky auto grows that take place and let you know about them right away.

    Hassle-free monitoring of VLFs

    If you are lucky or wise enough to be using SQL Monitor or another monitoring tool that lets you write your own custom metrics, then you can keep an eye on this very easily. There is a custom metric for VLFs (written by Stuart Ainsworth) already on the site, and there are some others there that are very useful, so take a moment or two to look around while you are there.

    Resources

    MSDN – http://msdn.microsoft.com/en-us/library/ms179355(v=sql.105).aspx
    Kimberly Tripp from SQLSkills.com – http://www.sqlskills.com/BLOGS/KIMBERLY/post/8-Steps-to-better-Transaction-Log-throughput.aspx
    Thomas LaRock at Simple-Talk.com – http://www.simple-talk.com/sql/database-administration/monitoring-sql-server-virtual-log-file-fragmentation/

    Disclosure

    I am a Friend of Red Gate. This means that I am more than likely to say good things about Red Gate DBA and Developer tools. No matter how awesome I make them sound, take the time to compare them with other products before you contact the Red Gate sales team to place your order.

    Read the article

  • How to disable all fieldset input, textarea and select elements except input checkboxes

    - by kumar
    Hello friends, I have this code in my view. It executes once per selected user: if I select 3 users, it runs three times and displays three separate fieldsets on the same page, one for each user. Here is my code... The code I used disables only the first user's fieldset, not all the other fieldsets. Within each fieldset I need to disable every input, select and textarea element except the input checkboxes. Please, can anyone help me? I need to do this for all the fieldsets. Thanks

    Read the article

  • How to provide Input to Dialogs designed by Qt Designer

    - by GG
    Hello, I am a Qt beginner working with Qt Designer to develop some small UI elements. I read http://doc.trolltech.com/4.5/designer-using-a-ui-file.html to learn how to use these GUI elements in my code, and I am using the multiple inheritance approach. I am introducing a bookmark feature which looks somewhat like http://img293.imageshack.us/img293/3041/screenshotyb.png. The problem I am facing is: how can I show all the existing bookmark folders in the drop-down (say the folders are in a QVector)? So my main problem is how I can pass some inputs to the UI element. I think I'm clear; please let me know if further explanation is required. Sorry for adding links directly, rich formatting in my browser is not working. Thanks in advance.

    Read the article

  • Android Reading from an Input stream efficiently

    - by RenegadeAndy
    Hey, I am making an HTTP GET request to a website for an Android application I am making. I am using a DefaultHttpClient and using HttpGet to issue the request. I get the entity response and from this obtain an InputStream object for getting the HTML of the page. I then cycle through the reply as follows:

        BufferedReader r = new BufferedReader(new InputStreamReader(inputStream));
        String x = "";
        x = r.readLine();
        String total = "";
        while(x != null){
            total += x;
            x = r.readLine();
        }

    However this is horrendously slow. Is this inefficient? I'm not loading a big web page (www.cokezone.co.uk), so the file size is not big. Is there a better way to do this? Thanks, Andy

    Read the article

  • Best way to set a default button (or trigger its event in javascript) for an input field, not part of a form

    - by Sheldon Pinkman
    I've got a stand-alone input field, not part of any form. I've also got a button that has some onclick event. When I type something in the input field and press the Enter key, I want it to effectively press the button, or trigger its onclick event, so that the button is "the input field's default button", so to speak.

        <input id='myText' type='text' />
        <button id='myButton' onclick='DoSomething()'>Do it!</button>

    I guess I can mess around with the input field's onkeypress or onkeydown events and check for the Enter key, etc. But is there a cleaner way, I mean something that associates the button with that input field, so that the button is the 'default action' or something for that field? Note that I'm not inside a form; I am not sending, posting, or submitting anything. The DoSomething() function just changes some of the HTML content locally, depending on the text input.

    Read the article

  • Intermittent bug - IE6 showing file as text in browser, rather than as file download

    - by Richard Ev
    In an ASP.NET WebForms 2.0 site we are encountering an intermittent bug in IE6 whereby a file download attempt results in the contents of the file being shown directly in the browser as text, rather than the file save dialog being displayed. Our application allows the user to download both PDF and CSV files. The code we're using is:

        HttpResponse response = HttpContext.Current.Response;
        response.Clear();
        response.AddHeader("Content-Disposition", "attachment;filename=\"theFilename.pdf\"");
        response.ContentType = "application/pdf";
        response.BinaryWrite(MethodThatReturnsFileContents());
        response.End();

    This is called from the code-behind click event handler of a button server control. Where are we going wrong with this approach?

    Edit: Following James' answer to this posting, the code I'm using now looks like this:

        HttpResponse response = HttpContext.Current.Response;
        response.ClearHeaders();
        // Setting cache to NoCache was recommended, but doing so results in a security
        // warning in IE6
        //response.Cache.SetCacheability(HttpCacheability.NoCache);
        response.AppendHeader("Content-Disposition", "attachment; filename=\"theFilename.pdf\"");
        response.ContentType = "application/pdf";
        response.BinaryWrite(MethodThatReturnsFileContents());
        response.Flush();
        response.End();

    However, I don't believe that any of the changes made will fix the issue.

    Read the article

  • Improving File Read Performance (single file, C++, Windows)

    - by david
    I have large (hundreds of MB or more) files that I need to read blocks from using C++ on Windows. Currently the relevant functions are:

        errorType LargeFile::read( void* data_out, __int64 start_position, __int64 size_bytes ) const
        {
            if( !m_open ) {
                // return error
            } else {
                seekPosition( start_position );
                DWORD bytes_read;
                BOOL result = ReadFile( m_file, data_out, DWORD( size_bytes ), &bytes_read, NULL );
                if( size_bytes != bytes_read || result != TRUE ) {
                    // return error
                }
            }
            // return no error
        }

        void LargeFile::seekPosition( __int64 position ) const
        {
            LARGE_INTEGER target;
            target.QuadPart = LONGLONG( position );
            SetFilePointerEx( m_file, target, NULL, FILE_BEGIN );
        }

    The performance of the above does not seem to be very good. Reads are on 4K blocks of the file. Some reads are coherent, most are not. A couple of questions: Is there a good way to profile the reads? What things might improve the performance? For example, would sector-aligning the data be useful? I'm relatively new to file I/O optimization, so suggestions or pointers to articles/tutorials would be helpful.

    Read the article

  • Partially downloaded a torrent file and renamed it

    - by user2613789
    I had partially downloaded a torrent file in Ubuntu and unfortunately renamed it. After some time I resumed the remaining torrent, but it then got downloaded under its default name, so the complete file's information is now split across two different files. Please help me out: is there any way I can combine the whole information into one file? I can't open either file, which means both files got corrupted or something like that. The file is in mp4 format. Please help me.

    Read the article

  • Which language is more suitable for heavy file tasks?

    - by All
    I need to write a script (based on basic functions) to process image/audio/video files. The process is mainly filesystem tasks and conversions. The database of files is stored in MySQL. The script is simple but puts a heavy load on the system; for example, renaming/converting/copying thousands of files in a single run. The script does not read the content of files into memory; it just manages the commands for sub-processes. The main weight is on the communication with the filesystem. The script will be used regularly for new files. My concern is about performance. I am thinking of either a shell script or a compiled language like C. Please advise which programming language is more suitable for this purpose and why. UPDATE: An example is to scan a folder for images, convert them with ImageMagick, move the files to a destination folder, get the file info, then update the database. As you can see, the process has no room for optimization, and most languages have similar APIs for popular programs like ImageMagick, MySQL, etc. Thus, it can be written in any language. I just wish to reduce resource usage by speeding up the long loop. NOTE: I know that questions comparing languages are not well received, but I really had a problem choosing, because the problems only appear in practice.
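
    Purely as an illustration of the loop being described, here is a minimal sketch in Python (a language the poster did not list as a candidate); the directory names, file extensions and ImageMagick invocation are invented assumptions, and the database update is left as a stub:

        import os
        import shutil
        import subprocess

        SRC_DIR = "/data/incoming"    # assumed source folder
        DST_DIR = "/data/processed"   # assumed destination folder

        def process_images():
            rows = []
            for entry in os.scandir(SRC_DIR):
                if not entry.name.lower().endswith((".png", ".tif")):
                    continue
                converted = os.path.join(SRC_DIR, entry.name + ".jpg")
                # The heavy work is done by the external tool, not the script itself.
                subprocess.run(["convert", entry.path, converted], check=True)
                moved = shutil.move(converted, DST_DIR)
                rows.append((entry.name, os.path.getsize(moved)))
            return rows  # these rows would then be written to the MySQL database

    Whatever the language, the run time is dominated by the spawned converters and the filesystem calls, which is essentially the point the UPDATE above makes: the loop mostly waits on external programs.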

    Read the article

  • Associating File Types with AutoVue Desktop Deployment

    - by [email protected]
    Windows users take for granted that when they double click on a document or design, it will open up in its application automatically. One of the questions I'm commonly asked is "How can I get the same behavior with AutoVue Desktop Deployment?". It's pretty easy, but there are a few tricks to doing it.

    Step 1: Download the new jvue_direct.bat and icon

    The first thing you'll need to do is download a slightly modified version of jvue_direct.bat. You can find it here (Document 1075784.1) on Oracle's Support Portal. You also want to download the AV.ico file. This is the icon that will be used for all file types associated with AutoVue. Place both of these files in your <AutoVueInstallDirectory>\bin directory.

    Step 2: Associate File Types With AutoVue

    There are two ways to do this. You can do this through the Windows user interface, or you can set up a batch file to do it.

    Associating File Types Through Windows

    The way most people associate file types with an application is by using the Windows user interface. You've probably tried to open a file type that Windows doesn't recognize and seen the familiar "Open with" window pop up. Although you can use this dialog to associate that file type with AutoVue, I don't recommend it. I much prefer using a batch file to associate file types with AutoVue.

    Associating File Types Using A Batch File

    There are a few good reasons to associate file types using a batch file instead of using the pop-up dialog method:

    - If you have several file types to associate with AutoVue, it's much easier to use a batch file to do them all at once.
    - Doing it through the Windows user interface requires having files of each type available. Using a batch file doesn't require having the files you're associating.
    - Associating file types through the dialog may work well for one person, but what if you're an administrator doing an enterprise-wide deployment of AutoVue Desktop Deployment for several hundred users? You don't want to do this manually for each user. You can have one simple batch file that's run on each user's PC to set up all the file types.
    - You can easily associate an icon with the file types you're opening with AutoVue.

    To use the batch file method, follow these steps:

    1. Create a file called filetype.bat using a text editor and copy and paste the following into it:

        @assoc .dwg=AVFile
        @assoc .jpg=AVFile
        @assoc .doc=AVFile
        @ftype AVFile="%~dp0jvue_direct.bat" "%%1"
        @reg add HKEY_CLASSES_ROOT\AVFile\DefaultIcon /v "" /f /d "%~dp0AV.ico"

    2. Change the lines starting with @assoc. Each of these lines associates a file extension with AutoVue. You can have as many @assoc lines as you want.
    3. Save this file in your <AutoVueInstallDirectory>\bin directory.
    4. Double click this file, or run it from a command prompt.
    5. Restart Windows to get the icons to show up.

    How Does This Work?

    The first three lines create a file type called AVFile. We are associating the extensions .dwg, .jpg, and .doc with this file type. You will want to change these lines when creating your own batch file. For example, to associate MicroStation designs, which have the extension .dgn, you would delete the @assoc lines above and add the line:

        @assoc .dgn=AVFile

    The line beginning with @ftype tells Windows that all AVFile type files should be opened using AutoVue Desktop Deployment. The final line associates the AutoVue icon with these file types. You may need to restart Windows to see the new icons.
    Warning: One Size Doesn't Fit All

    When deciding which file types should be associated with AutoVue, remember that there are different types of users using it. Your engineers may be pretty surprised to find that after installing AutoVue, double clicking their .dwg file opens up AutoVue instead of AutoCAD. If you have more than one type of AutoVue user, make sure you've considered what file types each user group will and will not want to be associated with AutoVue. If necessary, create a separate file association batch file for each user type.

    So that's it. In two simple steps you can double click your favorite designs and have them open automatically in AutoVue Desktop Deployment. I'd love to hear how you are using AutoVue Desktop Deployment. What other deployment tips would you be interested in learning about?

    Read the article

  • Working with multiple input and output files in Python

    - by Morlock
    I need to open multiple files (2 input and 2 output files), do complex manipulations on the lines from the input files and then append the results to the 2 output files. I am currently using the following approach:

        in_1 = open(input_1)
        in_2 = open(input_2)
        out_1 = open(output_1, "w")
        out_2 = open(output_2, "w")
        # Read one line from each 'in_' file
        # Do many operations on the DNA sequences included in the input files
        # Append one line to each 'out_' file
        in_1.close()
        in_2.close()
        out_1.close()
        out_2.close()

    The files are huge (each potentially approaching 1 GB), which is why I am reading through these input files one line at a time. I am guessing that this is not a very Pythonic way to do things. :) Would using the following form be good?

        with open("file1") as f1:
            with open("file2") as f2:
                # etc.

    If yes, could I do it while avoiding the highly indented code that would result? Thanks for the insights!
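
    For what it's worth, one way to keep the context managers but avoid the nesting is to open several files in a single with statement (supported since Python 2.7 / 3.1). A minimal sketch, with placeholder file names and a stand-in for the real per-line manipulation:

        # Sketch only: the file names and the "processing" are placeholders.
        with open("input_1") as in_1, open("input_2") as in_2, \
             open("output_1", "w") as out_1, open("output_2", "w") as out_2:
            # zip is lazy on Python 3; on Python 2 use itertools.izip for large files
            for line_1, line_2 in zip(in_1, in_2):
                out_1.write(line_1.upper())  # stand-in for the real work on line_1
                out_2.write(line_2.lower())  # stand-in for the real work on line_2
        # All four files are closed here automatically, even if an exception was raised.

    For an arbitrary number of files, contextlib.ExitStack (Python 3.3+) gives the same guarantee without a long with line.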

    Read the article

  • JQuery plugin: catch events for clicking/tabbing into and out of an input box

    - by poswald
    I'm creating a JavaScript jQuery Timepicker control plugin (which I hope to open source soon) and I would like some advice on how best to register the events in the cleanest way. The control will attach to an <input> box and provide a graphical way to enter times of day (14:25, 2:45 AM, etc.). It does this by adding a <div> after the input box. What I want is to bind an openControl() function that fires when the input is clicked or tabbed to, and a closeControl() function that fires when the input box is tabbed away from or deselected, but not if the control itself is clicked. That is, I don't want to close the control if you're clicking inside of the control's <input> or the <div>. Here's what I have been doing to try to get there:

        /* Close the control attached to the passed inputNode */
        function closeContainer(inputNode, options) {
            $input = $(inputNode);
            if ( $input.next().is(':visible')) {
                $input.next().hide(options.hideAnim, options.hideOptions, options.hideDuration, options.onHide );
            }
        }

        /* Open the control */
        function openContainer(node, options) {
            $input = $(node);
            $input.next().show(options.showAnim, options.showOptions, options.showDuration, options.onShow );
            // bind a click handler for closing the control
            $("body").bind('click', function (e) {
                $('.time-control').each( function () {
                    $input = $(this).prev();
                    // only close if click is outside of the control or the input box
                    if (jQuery.contains(this, e.target) || ($input.get(0) === e.target) ) {
                        closeContainer($input, options);
                        setTime($input, $input.next(), options);
                    } else {
                        closeContainer($input, options);
                    }
                });
            });
        }

    I want to add support for tabbing in/out but I feel like this approach is wrong. Focus/blur wasn't working well because the blur event fires if you click on the control. Should I be using those events but filtering out the ones that originate inside the control's div? Anyone have a better way of doing this? Thanks!

    Read the article

  • SQL SERVER – Attach mdf file without ldf file in Database

    - by pinaldave
    Background Story: One of my friends recently called up and asked me if I had spare time to look at his database and give him some performance tuning advice. Because I had some free time to help him out, I said yes. I asked him to send me the details of his database structure and sample data. He said that since his database is at a very early stage and is still small, he would like me to have the complete database. My response to him was "Sure! In that case, take a backup of the database and send it to me. I will restore it onto my computer and play with it."

    He did send me his database; however, his method made me write this quick note here. Instead of taking a full backup of the database and sending it to me, he sent me only the .mdf (primary database file). In fact, I had asked for a complete backup (I wanted to review file groups, files, as well as a few other details). Upon calling my friend, I found that he was not available, so now he has left me with only a .mdf file. As I had some extra time, I decided to check out his database structure and get back to him regarding the full backup whenever I could get in touch with him again.

    Technical Talk: If the database was shut down gracefully and there was no abrupt shutdown (power outages, pulled plugs, machine crashes or any other such reasons), it is possible (though there is no guarantee) to attach the .mdf file alone to the server. Please note that there can be many other reasons why a database will not attach or restore. In my case, the database had a clean shutdown and there were no complex issues. I was able to recreate a transaction log file and attach the received .mdf file. There are multiple ways of doing this, and I am listing all of them here. Before using any of them, please consult the Domain Expert in your company or industry. Also, never attempt this on a live/production server without the presence of a Disaster Recovery expert.

        USE [master]
        GO
        -- Method 1: I use this method
        EXEC sp_attach_single_file_db @dbname='TestDb',
        @physname=N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf'
        GO
        -- Method 2:
        CREATE DATABASE TestDb ON
        (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
        FOR ATTACH_REBUILD_LOG
        GO

    Method 2: If one or more log files are missing, they are recreated. There is one more method, which I am demonstrating here but have not used myself before. According to Books Online, it will work only if just one log file is missing. If more than one log file is involved, all of them are required to undergo the same procedure.

        -- Method 3:
        CREATE DATABASE TestDb ON
        ( FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
        FOR ATTACH
        GO

    Please read Books Online in depth and consult DR experts before working on the production server. In my case, the above syntax just worked fine as the database was clean when it was detached. Feel free to write your opinions and experiences, as it will help the IT community to learn more from your suggestions and skills.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, Readers Question, SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Unable to delete file from read-only file system

    - by tech
    I cannot delete files. When I tried to delete the file, this message appeared: "Error while deleting. There was an error deleting file2.rar." This is what I get from clicking on SHOW MORE DETAILS: "Error removing file: Read-only file system". This is what I've tried so far: deleting from the Downloads directory as my user, and deleting permanently, but it does not work.

    EDIT: When trying to remount it as read/write I get this error:

        jianyue@jianyue:~$ sudo mount -o remount,rw /media/A88F-8788
        [sudo] password for jianyue:
        sudo: unable to open /var/lib/sudo/jianyue/1: Read-only file system
        mount: can't find /media/A88F-8788 in /etc/fstab or /etc/mtab

    Read the article

  • Clean file separators in Ruby without File.join

    - by kerry
    I love anything that can be done to clean up source code and make it more readable. So, when I came upon this post, I was pretty excited. This is precisely the kind of thing I love. I have never felt good about 'file separator' strings because of their ugliness and verbosity. In Java we have:

        String path = "lib"+File.separator+"etc";

    And in Ruby a popular method is:

        path = File.join("lib","etc")

    Now, by overloading the '/' operator on String in Ruby:

        class String
          def /(str_to_join)
            File.join(self, str_to_join)
          end
        end

    We can now write:

        path = 'lib'/'src'/'main'

    Brilliant!

    Read the article

  • Cannot delete .Trash-503 directory, returns a $RECYCLE.BIN.trashinfo: Input/output error

    - by Parto
    I cannot delete the .Trash-503 folder via the GUI or the terminal; it returns "$RECYCLE.BIN.trashinfo: Input/output error". Not even sudo rm -r or even a simple ls works in that trash directory. Check the terminal output below:

        subroot@subroot:~$ cd /media/xxxxx/
        subroot@subroot:/media/xxxxx$ rm .Trash-503/
        rm: cannot remove `.Trash-503/': Is a directory
        subroot@subroot:/media/xxxxx$ rm -r .Trash-503/
        rm: cannot remove `.Trash-503/info/$RECYCLE.BIN.trashinfo': Input/output error
        rm: cannot remove `.Trash-503/info/found.000.trashinfo': Input/output error
        rm: cannot remove `.Trash-503/info': Directory not empty
        subroot@subroot:/media/xxxxx$ sudo rm -r .Trash-503/
        [sudo] password for subroot:
        rm: cannot remove `.Trash-503/info/$RECYCLE.BIN.trashinfo': Input/output error
        rm: cannot remove `.Trash-503/info/found.000.trashinfo': Input/output error
        subroot@subroot:/media/xxxxx$ cd .Trash-503/
        subroot@subroot:/media/xxxxx/.Trash-503$ ls
        info
        subroot@subroot:/media/xxxxx/.Trash-503$ cd info/
        subroot@subroot:/media/xxxxx/.Trash-503/info$ ls
        ls: cannot access $RECYCLE.BIN.trashinfo: Input/output error
        ls: cannot access found.000.trashinfo: Input/output error
        found.000.trashinfo $RECYCLE.BIN.trashinfo
        subroot@subroot:/media/xxxxx/.Trash-503/info$

    What's going on here, and how can I delete this folder?

    EDIT: I tried checking and repairing the partition using GParted, only to get this error message:

        ERROR: Filesystem check failed!
        ERROR: 264 clusters are referenced multiple times.
        NTFS is inconsistent. Run chkdsk /f on Windows then reboot it TWICE!
        The usage of the /f parameter is very IMPORTANT!
        No modification was and will be made to NTFS by this software until it gets repaired.

    I don't have Windows installed; how can I run chkdsk /f from Ubuntu?

    Read the article

  • Design of input files reading when it comes to defaults/transformations

    - by Stefano Borini
    Suppose you have an application that reads an input file, in a language that does not support the concept of None. The input is read, parsed, and the contents are stored in a structure for later use. Now, in general you want to take into account transformations of the data from the input, such as adding default values when something is not specified, or adding full path information to relative paths specified in the input. There are two different strategies to achieve this.

    The first strategy is to perform these transformations at input file reading time. In practice, you put all the intelligence into the input parser, and your application has no logic to deal with unexpected circumstances, such as an unspecified value. You lose the information of what was specified and what wasn't, but you gain in black-boxing the details. Your "running code" needs that information in any case and in a proper form, and is not concerned with whether it is a default or user-specified information.

    The second strategy is to make the file reader a pure one-to-one mapper from the file to a memory-stored object, with no intelligent behavior. Unspecified values are not filled in (which may, however, be a problem in languages not supporting None) and data is stored verbatim from the file. The intelligence for recovery must now go into the "running code", which must check what was specified in the file, fall back to a default if necessary, or modify the input properly before using it.

    I would like to know your opinion on these two approaches, and in particular which one you have found the most frequently implemented.
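
    To make the contrast concrete, here is a small sketch of the two strategies in Python (which does have None; the option names and defaults are invented for illustration):

        import os

        DEFAULTS = {"threads": 1, "workdir": "."}  # invented option names/defaults

        # Strategy 1: the parser applies defaults and transformations up front,
        # so the running code never sees a missing or relative value.
        def parse_eager(pairs):
            cfg = dict(DEFAULTS)
            cfg.update(pairs)
            cfg["workdir"] = os.path.abspath(cfg["workdir"])
            return cfg

        # Strategy 2: the parser is a verbatim mapper; the running code decides later.
        def parse_verbatim(pairs):
            return dict(pairs)  # only what the file actually said

        def run(raw_cfg):
            # fallback and normalisation live with the code that uses the values
            threads = raw_cfg.get("threads", DEFAULTS["threads"])
            workdir = os.path.abspath(raw_cfg.get("workdir", DEFAULTS["workdir"]))
            return threads, workdir

    With the second form it stays visible (via the keys present in raw_cfg) which values the user actually specified, which is exactly the trade-off described above.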

    Read the article

  • File property information (last write time and file size) in Explorer out of date by hours over the network

    - by David L Morris
    An application is running on a Windows XP Professional machine, picking up a file from a network share on another Windows machine. It detects that the file has been updated (by date and time or, optionally, file size) and reads it for any new data. Most of the time the last write time and file size seem to be up to date. Occasionally, this information stops being updated, even though the file is growing (intermittently during the day) with appended content, so that the last write time and file size remain fixed at some arbitrary moment. This is visible in Explorer, which shows a fixed last write time on the reading machine. Just opening the file to edit it in Notepad immediately refreshes the file properties, and the other application picks up where it left off. The file location can't be changed, nor can the location of the relevant applications. Any solutions to resolve this problem?

    Read the article

  • Do input template languages exist?

    - by marczellm
    When I have to create some textual representation of data, I can use a template language, so that my code does not have to worry about the structure of the output file; I can sometimes even write code that's independent of whether the output is XML, LaTeX or any other plain text. A simple example:

    Template (in a separate file):

        <someXmlTag> $variableName </someXmlTag>

    Code:

        Template(temstring).substitute(variableName="value")

    Result (written to the output file):

        <someXmlTag> value </someXmlTag>

    I want to do the same, but in the opposite direction. I have XML, plain text, or other files as input. I want to describe the input structure in a separate file that looks like the input but has these variable declarations in it, and I want to handle it with code that's independent of the structure. Is there a library for this concept? (We usually handle XML input by using an XML parser library to describe the input structure in program code, handle plain text input by writing regexes in code, and don't handle LaTeX input because LaTeX can't really be parsed.)
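
    As a rough sketch of the idea rather than a library recommendation, the same $variableName template can be turned into a regular expression and run against the input; only the Python standard library is used, and the template text is an invented example:

        import re

        template_text = "<someXmlTag> $variableName </someXmlTag>"  # invented example

        def template_to_regex(text):
            # Escape the literal parts, then turn each $name placeholder into a named group.
            pattern = re.escape(text)
            pattern = re.sub(r"\\\$(\w+)", r"(?P<\1>.+?)", pattern)
            return re.compile(pattern)

        match = template_to_regex(template_text).search("<someXmlTag> value </someXmlTag>")
        print(match.groupdict())  # {'variableName': 'value'}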

    Read the article
