Search Results

Search found 70026 results on 2802 pages for 'file recovery'.


  • Join multiple consecutive SQLite database dump files into 1 common database? Purpose: Search through ENTIRE Chrome Browsing History

    - by porg
    Google Chrome's default browsing-history search only reaches back about 100 days. Nevertheless, in your application data Chrome keeps your entire browsing history in SQLite database files, with the file naming scheme "History Index YYYY-MM". I am looking for a way to search through my entire browsing history, with sophisticated filters (limit search terms to certain fields such as URL, domain, title or body text; wildcard or regex terms; date ranges), using either some ready-made software or a command-line recipe. eHistory came close, as it can limit terms to fields, but it lacks wildcards/regexes and has the same limited time horizon as the default search; beyond that, I could not find any suitable Chrome extension or standalone (Mac) app. Failing that, I want a command line to join multiple SQLite database files into one database, which I can then query with the full syntax power. In the spirit of the pseudo code below, preferred this way:

        sqlite --targetDatabase ChromeHistoryAll --importFiles /path/to/ChromeAppData/History\ Index* --importOnlyYetUnknownFiles

    Or, if my desired feature --importOnlyYetUnknownFiles is not possible (the feature could also be called "avoid duplicate imports by checking UIDs"), then by explicitly importing only the files that I know have not yet been imported into the ChromeHistoryAll database:

        cd ChromeAppData; sqlite --databaseTarget ChromeHistoryAll --importFiles YetNotImported1 YetNotImported2 YetNotImported3

    All my queries would then run against the "ChromeHistoryAll" database. P.S.: An additional question of general interest: is there a way to perform a query against a temporary database created on the fly from multiple files? Something like:

        sqlite --query="SQL query" --targetDatabase DbAll --DBtemporaryInRAM --importFiles db1 db2 db3

    This is surely not applicable to my Chrome question, as these History Index files have a combined size of about 500 MB, so such a query would perform badly, but it could come in handy in other situations.
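
    A rough approximation of the preferred command is possible with the stock sqlite3 CLI by attaching each monthly file to a combined database and copying its rows across. This is only a sketch: the table name pages_content is a placeholder (inspect the real schema with .tables/.schema first), and INSERT OR IGNORE only skips duplicates if the table carries a unique key.

        #!/bin/sh
        # Sketch: merge every monthly "History Index" file into ChromeHistoryAll.db.
        # "pages_content" is an assumed table name; check the real schema first.
        TARGET=ChromeHistoryAll.db
        for f in "History Index"*; do
          sqlite3 "$TARGET" "
            ATTACH DATABASE '$f' AS src;
            CREATE TABLE IF NOT EXISTS main.pages_content AS
              SELECT * FROM src.pages_content WHERE 0;
            INSERT OR IGNORE INTO main.pages_content SELECT * FROM src.pages_content;
            DETACH DATABASE src;
          "
        done

    For the P.S., the same ATTACH trick run against sqlite3 :memory: gives an on-the-fly combined database, with the performance caveat already noted.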


  • Microsoft Word 2008 on the Mac sometimes "Disappears" documents, really.

    - by Ross Charette
    This happens in a computer lab environment and has happened at least three times. We are running Microsoft Office 2008 for Mac on Leopard, with everything updated. Our users' home directories are on a network drive, but the /Library/Cache folder is local. Typically a student has a Word file they have been working on, saved before they even logged onto the computer that day. They log on, open the document, click the save icon (not File > Save), sometimes even save multiple times, then close Word. The document is now gone. It's not hidden, and there are no autosaves or anything in the Cache folder. It's definitely not in the Trash or .Trashes folder. Word can't find it when you click on it in 'recent documents'. Searching meticulously through every folder in their home drive turns up nothing. They look using Finder; I look while ssh'd in as root using ls -la. I look for similar files in case they renamed it by mistake. It's gone. Disappeared. Vaporized. It's happened to at least three different users in the past year. Much whining. Any ideas?


  • Folder Sharing NTFS permissions with Share Permission

    - by Muhammad Adly
    I have a problem on my domain. Some history: I had a server with Windows Server 2008 R2 installed, running the AD, DNS, DHCP and File roles. A month ago I decided to install a new 2008 R2 server to take over AD, DNS and DHCP and leave the file server role on the old machine. I did the following, exactly:

    1) Robocopied all my data to an external HDD.
    2) Installed a new server with 2008 R2.
    3) Transferred all 5 roles to move the domain to the new server (MainDC).
    4) NETLOGON and SYSVOL did not transfer, but I reinitialized them and they are now operating on MainDC.
    5) Re-created and re-configured new GPOs and linked them to my OUs.
    6) Reinstalled the old server with a fresh installation of Windows 2008 R2 (FileServer).
    7) Joined it to my domain with my domain credentials.

    The issue: when I share a folder on \\fileserver, the permissions I set in the sharing permissions are applied to the main shared folder and its subfolders, but the security (NTFS) settings are not applied. For example, say I'm sharing \\fileserver\MainFolder with a share permission that lets Authenticated Users read, so everyone can read the main shared folder. If I then set a security permission on \\fileserver\MainFolder\User1 so that User1 can Read/Write/Modify, User1 cannot perform those operations when accessing it through the network share. I tried a lot of steps from topics online: taking ownership of the folder, removing inheritance from the parent folder, applying changes to child objects. I also tried building a new folder structure, and another host PC, and got the same issue each time.
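
    This behaves as described when the share-level permission is the bottleneck: over the network the effective access is the more restrictive of the share permission and the NTFS ACL, so a Read-only share caps User1's NTFS Modify right. A sketch with built-in tools (share name, path and accounts are placeholders):

        :: Re-create (or adjust) the share so the share level allows Change,
        :: then let NTFS ACLs do the per-subfolder restriction.
        net share MainFolder=D:\Shares\MainFolder /GRANT:"Authenticated Users",CHANGE

        :: Grant User1 Modify on his subfolder, inherited by files and subfolders...
        icacls "D:\Shares\MainFolder\User1" /grant MYDOMAIN\User1:(OI)(CI)M

        :: ...and verify what actually applies.
        icacls "D:\Shares\MainFolder\User1"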


  • 3 Servers, is this a cluster scenario?

    - by HornedBeast
    Hello. At the moment I have one Ubuntu 9.10 server running a simple Samba share, a mail server, a DNS server and a DHCP server. Mostly it's just there for file sharing and email. I also have two other servers of exactly the same hardware and spec as the first, which use rsync to retrieve the shared folders and back them up. However, if the first server goes down, all of our shares disappear along with our mail, and the system must be rebuilt. I also find that if people are downloading a large amount from the file server, no one can access their email, especially in the morning when everyone is signing in at once. Would it be more beneficial to have all three servers running the same services and doing the same thing, in some sort of cluster with load balancing? In short, how can I get the best out of my three hardware servers? I'm not really sure where to begin looking, or how to go about a setup where the three servers are identical but perhaps one acts as the main load balancer. If someone can point me in the right direction, or if this simply sounds like one of those enterprise cloud setups that is now a default option in Ubuntu Server 9.10+, then I'll go down that route. Cheers in advance. Andy


  • Ruby on rails: Image downloads with Authentication/Authorization/Time outs

    - by ak1dnar
    Hi guys, I have a few doubts about implementing file downloads. I'm creating an app that uses attachment_fu with Amazon S3 to upload files. Things are working pretty well so far on the uploading side; now it's time to implement the downloads. Here is what I need: a logged-in user searches and browses for images and should be able to add the files to a download basket (let's say it's a download shopping cart). Finally, the user should be able to download these file(s) from S3, probably as a zipped file. Is there any plugin/gem I can use for this?
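
    I'm not aware of a single plugin that covers the whole basket flow, but the zipped download itself can be sketched with the rubyzip gem: build the archive in memory and stream it with send_data. Everything below is hypothetical: current_user, the basket_attachments association, the s3_data helper, and even the rubyzip API version (the 0.9.x ZipOutputStream) are assumptions.

        # Hypothetical controller action, assuming the rubyzip gem.
        require 'zip/zip'

        class DownloadsController < ApplicationController
          def basket
            attachments = current_user.basket_attachments    # assumed association

            buffer = Zip::ZipOutputStream.write_buffer do |zio|
              attachments.each do |a|
                zio.put_next_entry(a.filename)
                zio.write(a.s3_data)                         # assumed helper returning the raw S3 bytes
              end
            end

            buffer.rewind
            send_data buffer.read, :type => 'application/zip', :filename => 'basket.zip'
          end
        end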


  • Adding an expression based image in a client report definition file (RDLC)

    - by rajbk
    In previous posts, I showed you how to create a report using Visual Studio 2010 and how to add a hyperlink to the report. In this post, I show you how to add an expression-based image to each row of the report. This is similar to displaying a checkbox column for Boolean values. A sample project is attached to the bottom of this post. To start off, download the project we created earlier from here. The report we created had a "Discontinued" column of type Boolean. We are going to change it to display an "available" icon or an "unavailable" icon based on the "Discontinued" row value. Load the project and double click on Products.rdlc. With the report design surface active, you will see the "Report Data" tool window. Right click on the Images folder and select "Add Image...". Add the available_icon.png and discontinued_icon.png images (the sample project at the end of this post has the icon png files). You can see the images we added in the "Report Data" tool window. Drag and drop the available_icon into the "Discontinued" column row (not the header). We get a dialog box which allows us to set the image properties. We will add an expression that specifies the image to display based on the "Discontinued" value from the Product table. Click on the expression (fx) button and add the following expression:

        = IIf(Fields!Discontinued.Value = True, "discontinued_icon", "available_icon")

    Save and exit all dialog boxes. In the report design surface, resize the column header and change the text from "Discontinued" to "In Production". (Optional) Right click on the image cell (not the header), go to "Image Properties...", and offset it by 5pt from the left. (Optional) Change the border color, since it is not set by default for image columns. We are done adding our image column! Compile the application and run it. You will see that the "In Production" column has red 'x' icons for discontinued products. Download the VS 2010 sample project: NorthwindReportsImage.zip. Other posts: Adding a hyperlink in a client report definition file (RDLC); Rendering an RDLC directly to the Response stream in ASP.NET MVC; ASP.NET MVC Paging/Sorting/Filtering using the MVCContrib Grid and Pager; Localization in ASP.NET MVC 2 using ModelMetadata; Setting up Visual Studio 2010 to step into Microsoft .NET Source Code; Running ASP.NET Webforms and ASP.NET MVC side by side; Pre-filtering and shaping OData feeds using WCF Data Services and the Entity Framework.


  • Save gcc compile status to a text file for Java

    - by JohnBore
    I'm making a C Assessment Program through Java, which has a bunch of programming questions for C, and it lets the user input an answer in the form of C code, and then press a "Compile" button, which is linked to a bat file that runs the user input code through gcc. I've got the input and compiling working, but I need to get the output from the compiler and get that to print textarea within the program. I can get a simple "Hello, world" compiling, but I'm having trouble getting programs that require a user input with scanf, for example, to be printed. else if(e.getSource().equals(compile)){ if(questionNumber<1){ JOptionPane.showMessageDialog(programFrame, "Please start the assessment", "Compile Error", JOptionPane.ERROR_MESSAGE); } else{ FileOutputStream fileWrite; try { fileWrite = new FileOutputStream("demo/demo.c"); new PrintStream(fileWrite).println(input.getText());//saves what the user has entered in to a C source file fileWrite.close(); @SuppressWarnings("unused") Process process = Runtime.getRuntime().exec("cmd /c compile.bat");//runs the batch file to compile the source file compileCode(); try{ fileStream = new FileInputStream("demo/output.txt"); inputStream = new DataInputStream(fileStream); bufferRead = new BufferedReader(new InputStreamReader(inputStream)); while((stringLine = bufferRead.readLine())!=null){ compiled.append(stringLine); compiled.append("\n"); } inputStream.close(); } catch(IOException exc){ System.err.println("Unable to read file"); System.exit(-1); } } catch (IOException exc) { JOptionPane.showMessageDialog(programFrame, "Demo file not found", "File Error", JOptionPane.ERROR_MESSAGE); } } This is the actionPerformed method for the "Compile" button, the compileCode() is the JFrame that displays the output and "compiled" is the textArea for the output. My batch file is: C: cd dev-cpp\bin gcc.exe H:\workspace\QuestionProgram\demo\demo.c -o demo > H:\workspace\QuestionProgram\demo\compilestatus.txt demo > H:\workspace\QuestionProgram\demo\output.txt I'm not sure how I can do it, so the frame is created for the output of the code if the code requires a user input as the command prompt doesn't open without adding "START" to .exec(), but then the frame appears before the program has finished running. Also, how would I get the output of the compiler if the compile fails because of an error? The way I've got it in my batch file at the moment doesn't put anything in a text file if it fails.
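
    One way to capture the compiler's messages, errors included, without the intermediate batch and text files is to run gcc straight from Java, merge stderr into stdout, and wait for the process to exit before reading. A sketch (paths mirror the batch file and may need adjusting; needs the java.io imports):

        // Sketch: compile from Java and return gcc's output, errors included.
        private String compileDemo() throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    "C:\\dev-cpp\\bin\\gcc.exe",
                    "H:\\workspace\\QuestionProgram\\demo\\demo.c",
                    "-o", "H:\\workspace\\QuestionProgram\\demo\\demo.exe");
            pb.redirectErrorStream(true);            // compiler errors land in the same stream
            Process gcc = pb.start();

            StringBuilder output = new StringBuilder();
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(gcc.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append("\n");
            }
            reader.close();

            int exitCode = gcc.waitFor();            // 0 means the compile succeeded
            return exitCode == 0 ? "Compiled successfully.\n" : output.toString();
        }

    For answers that call scanf, the same pattern applies to running the compiled demo: start it with ProcessBuilder as well and write the test input to its getOutputStream() instead of relying on a console window.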


  • Android: How to copy a SQLite database from one application to another

    - by mahdaeng
    I have a lite version of an application that uses a SQLite database. I want to copy that database over to the full version of the application when the user installs the full version. I have written some code to perform the file copy, but the lite database file always comes up as unreadable. The file is there and I can point to it, but I can't read it to perform the copy. In the Android documentation, we read: You can save files directly on the device's internal storage. By default, files saved to the internal storage are private to your application and other applications cannot access them (nor can the user). Note the words, "by default". Is there a way that I can override that default and make the SQLite file readable by my other application? Thank you.


  • Windows Azure: Creation of a file on cloud blob container

    - by veda
    I am writing a program that will execute in the cloud. The program will generate output that should be written to a file, and the file should be saved in a blob container. I don't have any idea how to do that. Will this code:

        FileStream fs = new FileStream(file, FileMode.Create);
        StreamWriter sw = new StreamWriter(fs);

    generate a file named "file" on the cloud?
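
    No: FileStream and StreamWriter write to the role instance's local disk, not to blob storage. To create a blob you go through the storage client library. A sketch using the classic Microsoft.WindowsAzure.StorageClient API (the connection-string, container and blob names are assumptions):

        // Sketch: write program output straight into a blob.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
        CloudBlobClient blobClient = account.CreateCloudBlobClient();

        CloudBlobContainer container = blobClient.GetContainerReference("output");
        container.CreateIfNotExist();                 // CreateIfNotExists() in later SDKs

        CloudBlob blob = container.GetBlobReference("file.txt");
        blob.UploadText("program output goes here");  // or UploadFromStream(...) for binary data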


  • Jumbled byte array after using TcpClient and TcpListener

    - by Dylan
    I want to use the TcpClient and TcpListener to send an mp3 file over a network. I implemented a solution of this using sockets, but there were some issues so I am investigating a new/better way to send a file. I create a byte array which looks like this: length_of_filename|filename|file This should then be transmitted using the above mentioned classes, yet on the server side the byte array I read is completely messed up and I'm not sure why. The method I use to send: public static void Send(String filePath) { try { IPEndPoint endPoint = new IPEndPoint(Settings.IpAddress, Settings.Port + 1); Byte[] fileData = File.ReadAllBytes(filePath); FileInfo fi = new FileInfo(filePath); List<byte> dataToSend = new List<byte>(); dataToSend.AddRange(BitConverter.GetBytes(Encoding.Unicode.GetByteCount(fi.Name))); // length of filename dataToSend.AddRange(Encoding.Unicode.GetBytes(fi.Name)); // filename dataToSend.AddRange(fileData); // file binary data using (TcpClient client = new TcpClient()) { client.Connect(Settings.IpAddress, Settings.Port + 1); // Get a client stream for reading and writing. using (NetworkStream stream = client.GetStream()) { // server is ready stream.Write(dataToSend.ToArray(), 0, dataToSend.ToArray().Length); } } } catch (ArgumentNullException e) { Debug.WriteLine(e); } catch (SocketException e) { Debug.WriteLine(e); } } } Then on the server side it looks as follows: private void Listen() { TcpListener server = null; try { // Setup the TcpListener Int32 port = Settings.Port + 1; IPAddress localAddr = IPAddress.Parse("127.0.0.1"); // TcpListener server = new TcpListener(port); server = new TcpListener(localAddr, port); // Start listening for client requests. server.Start(); // Buffer for reading data Byte[] bytes = new Byte[1024]; List<byte> data; // Enter the listening loop. while (true) { Debug.WriteLine("Waiting for a connection... "); string filePath = string.Empty; // Perform a blocking call to accept requests. // You could also user server.AcceptSocket() here. using (TcpClient client = server.AcceptTcpClient()) { Debug.WriteLine("Connected to client!"); data = new List<byte>(); // Get a stream object for reading and writing using (NetworkStream stream = client.GetStream()) { // Loop to receive all the data sent by the client. while ((stream.Read(bytes, 0, bytes.Length)) != 0) { data.AddRange(bytes); } } } int fileNameLength = BitConverter.ToInt32(data.ToArray(), 0); filePath = Encoding.Unicode.GetString(data.ToArray(), 4, fileNameLength); var binary = data.GetRange(4 + fileNameLength, data.Count - 4 - fileNameLength); Debug.WriteLine("File successfully downloaded!"); // write it to disk using (BinaryWriter writer = new BinaryWriter(File.Open(filePath, FileMode.Append))) { writer.Write(binary.ToArray(), 0, binary.Count); } } } catch (Exception ex) { Debug.WriteLine(ex); } finally { // Stop listening for new clients. server.Stop(); } } Can anyone see something that I am missing/doing wrong?
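
    One likely culprit, offered as a guess: data.AddRange(bytes) appends the whole 1024-byte buffer on every pass, regardless of how many bytes Read() actually returned, so stale bytes from earlier reads get mixed into the result. A sketch of a receive loop that keeps only what was read:

        // Keep only the bytes each Read() call actually produced.
        int bytesRead;
        while ((bytesRead = stream.Read(bytes, 0, bytes.Length)) != 0)
        {
            for (int i = 0; i < bytesRead; i++)
            {
                data.Add(bytes[i]);
            }
        }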


  • How to read data from keyboard and store it in a file, shellscript

    - by Sunil Kumar Sahoo
    Hi, I have a file, try.SPEC, which contains the line "Version: 1.0.0.1". Now I want to write a shell script that reads a version number from the keyboard and inserts it into the file. For example, if the user enters the version number 2.1.1.1, the file should then contain "Version: 2.1.1.1" instead of "Version: 1.0.0.1". In other words, I must be able to change the version without knowing which version number is currently present in the spec file. Thanks, Sunil Kumar Sahoo
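
    A minimal sketch of such a script, assuming GNU sed (its -i option edits the file in place; on other systems, write to a temporary file and move it back):

        #!/bin/sh
        # Prompt for a version and rewrite the Version: line in try.SPEC,
        # whatever version number is currently there.
        printf "Enter the new version number: "
        read new_version
        sed -i "s/^Version:.*/Version: $new_version/" try.SPEC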


  • Should an HTTP POSTed file be base64 encoded?

    - by andybee
    I'm currently implementing a client application that POSTs a file over HTTP, and I have implemented base64 encoding on the file's data parameter. However, when inspecting the traffic between a simple HTML page with a file upload form and the server, it appears that no Content-Transfer-Encoding header is sent in the body part that describes the file. Is that the preferred way of POSTing a file over HTTP?
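
    For reference, a browser submitting a multipart/form-data upload sends the file bytes raw; it does not base64-encode them and does not add a Content-Transfer-Encoding part header, which matches what was observed. A trimmed example of what goes on the wire (boundary and names are made up):

        POST /upload HTTP/1.1
        Host: example.com
        Content-Type: multipart/form-data; boundary=----boundary42

        ------boundary42
        Content-Disposition: form-data; name="file"; filename="photo.jpg"
        Content-Type: image/jpeg

        <raw binary bytes of the file, not base64>
        ------boundary42--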


  • Sharing a file from Android to Gmail or to Dropbox

    - by Calaf
    To share a simple text file, I started by copying verbatim from FileProvider's manual page: <application android:allowBackup="true" android:icon="@drawable/ic_launcher" android:label="@string/app_name" android:theme="@style/AppTheme" > <provider android:name="android.support.v4.content.FileProvider" android:authorities="com.mycorp.helloworldtxtfileprovider.MainActivity" android:exported="false" android:grantUriPermissions="true" > <meta-data android:name="android.support.FILE_PROVIDER_PATHS" android:resource="@xml/my_paths" /> </provider> <activity android:name="com.mycorp.helloworldtxtfileprovider.MainActivity" ... Then I saved a text file and used, again nearly verbatim, the code under Sending binary content. (Notice that this applies more accurately in this case than "Sending text content" since we are sending a file, which happens to be a text file, rather than just a string of text.) For the convenience of duplication on your side, and since the code is in any case so brief, I'm including it here in full. public class MainActivity extends Activity { @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_main); String filename = "hellow.txt"; String fileContents = "Hello, World!\n"; byte[] bytes = fileContents.getBytes(); FileOutputStream fos = null; try { fos = this.openFileOutput(filename, MODE_PRIVATE); fos.write(bytes); } catch (IOException e) { e.printStackTrace(); } finally { try { fos.close(); } catch (IOException e) { e.printStackTrace(); } } File file = new File(filename); Intent shareIntent = new Intent(); shareIntent.setAction(Intent.ACTION_SEND); shareIntent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(file)); shareIntent.setType("application/txt"); startActivity(Intent.createChooser(shareIntent, getResources().getText(R.string.send_to))); file.delete(); } } Aside from adding a value for send_to in res/values/strings.xml, the only other change I did to the generic Hello, World that Eclipse creates is to add the following in res/xml/my_paths.xml (as described on the page previously referenced. <paths xmlns:android="http://schemas.android.com/apk/res/android"> <Files-path name="files" path="." /> </paths> This code runs fine. It shows a list of intent recipients. But sending the text file to either Dropbox or to Gmail fails. Dropbox sends the notification "Uploading to Dropbox" followed by "Upload failed: my_file.txt". After "sending message.." Gmail sends "Couldn't send attachment". What is wrong?
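
    Two likely trouble spots, offered as guesses: the element name in my_paths.xml is case-sensitive (it should be files-path, not Files-path), and Uri.fromFile(new File(filename)) bypasses the FileProvider entirely and points at the working directory rather than at the file openFileOutput() created; in addition, file.delete() right after startActivity() races the receiving app. A sketch of the sharing part using the provider that is already declared in the manifest:

        // Resolve the file inside getFilesDir() (where openFileOutput() put it)
        // and hand out a content:// Uri through the declared FileProvider.
        File file = new File(getFilesDir(), filename);
        Uri contentUri = FileProvider.getUriForFile(
                this, "com.mycorp.helloworldtxtfileprovider.MainActivity", file);

        Intent shareIntent = new Intent(Intent.ACTION_SEND);
        shareIntent.setType("text/plain");
        shareIntent.putExtra(Intent.EXTRA_STREAM, contentUri);
        shareIntent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        startActivity(Intent.createChooser(shareIntent, getResources().getText(R.string.send_to)));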


  • Infinite loop when using fscanf

    - by user1409641
    I wrote this simple program in C, because I'm studying FILES right now at University. I take a txt file with a list of the results of the last race so my program will show the data formatted as I want. Here's my code: /* Esercizio file Motogp */ #include <stdio.h> #define SIZE 20 int main () { int pos, punt, num; float kmh; char nome[SIZE+1], cognome[SIZE+1], moto[SIZE+1]; char naz[SIZE+1], nome_file[SIZE+1]; FILE *fp; printf ("Inserisci il nome del file da aprire: "); gets (nome_file); fp = fopen (nome_file, "r"); if (fopen == NULL) printf ("Errore nell' apertura del file %s\n", nome_file); else { while (fscanf (fp, "%d %d %d %s %s %s %s %.2f", &pos, &punt, &num, nome, cognome, naz, moto, &kmh) != EOF ) { printf ("Posizione di arrivo: %d\n", pos); printf ("Punteggio: %d\n", punt); printf ("Numero pilota: %d\n", num); printf ("Nome pilota: %s\n", nome); printf ("Cognome pilota: %s\n", cognome); printf ("Nazione: %s\n", naz); printf ("Moto: %s\n", moto); printf ("Media Kmh: %d\n\n", kmh); } } fclose(fp); return 0; } and there's my txt file: 1 25 99 Jorge LORENZO SPA Yamaha 164.4 2 20 26 Dani PEDROSA SPA Honda 164.1 3 16 4 Andrea DOVIZIOSO ITA Yamaha 163.8 4 13 1 Casey STONER AUS Honda 163.8 5 11 35 Cal CRUTCHLOW GBR Yamaha 163.6 6 10 19 Alvaro BAUTISTA SPA Honda 163.5 7 9 46 Valentino ROSSI ITA Ducati 163.3 8 8 6 Stefan BRADL GER Honda 162.9 9 7 69 Nicky HAYDEN USA Ducati 162.5 10 6 11 Ben SPIES USA Yamaha 162.3 11 5 8 Hector BARBERA SPA Ducati 162.1 12 4 17 Karel ABRAHAM CZE Ducati 160.9 13 3 41 Aleix ESPARGARO SPA ART 160.2 14 2 51 Michele PIRRO ITA FTR 160.1 15 1 14 Randy DE PUNIET FRA ART 160.0 16 0 77 James ELLISON GBR ART 159.9 17 0 54 Mattia PASINI ITA ART 159.4 18 0 68 Yonny HERNANDEZ COL BQR 159.4 19 0 9 Danilo PETRUCCI ITA Ioda 158.2 20 0 22 Ivan SILVA SPA BQR 158.2 When I run my program, it return me an infinite loop of the first one. Why? Is there another function to read those data?
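
    One probable cause: "%.2f" is not a valid fscanf conversion (precision is a printf-only feature), so the kmh field fails to convert, fscanf returns a short count (never EOF) without advancing, and the loop repeats the first record forever. Two smaller issues: the NULL check tests fopen instead of fp, and kmh is printed with %d. A sketch of the corrected loop:

        /* Check the number of converted fields instead of comparing with EOF,
         * drop the precision from the scanf format, and bound the strings. */
        while (fscanf(fp, "%d %d %d %20s %20s %20s %20s %f",
                      &pos, &punt, &num, nome, cognome, naz, moto, &kmh) == 8) {
            printf("Posizione di arrivo: %d\n", pos);
            /* ... other fields as before ... */
            printf("Media Kmh: %.2f\n\n", kmh);
        }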


  • stdio's remove() not always deleting on time.

    - by Kyte
    For a particular piece of homework, I'm implementing a basic data storage system using sequential files under standard C, which cannot load more than 1 record at a time. So, the basic part is creating a new file where the results of whatever we do with the original records are stored. The previous file's renamed, and a new one under the working name is created. The code's compiled with MinGW 5.1.6 on Windows 7. Problem is, this particular version of the code (I've got nearly-identical versions of this floating around my functions) doesn't always remove the old file, so the rename fails and hence the stored data gets wiped by the fopen(). FILE *archivo, *antiguo; remove("IndiceNecesidades.old"); // This randomly fails to work in time. rename("IndiceNecesidades.dat", "IndiceNecesidades.old"); // So rename() fails. antiguo = fopen("IndiceNecesidades.old", "rb"); // But apparently it still gets deleted, since this turns out null (and I never find the .old in my working folder after the program's done). archivo = fopen("IndiceNecesidades.dat", "wb"); // And here the data gets wiped. Basically, anytime the .old previously exists, there's a chance it's not removed in time for the rename() to take effect successfully. No possible name conflicts both internally and externally. The weird thing's that it's only with this particular file. Identical snippets except with the name changed to Necesidades.dat (which happen in 3 different functions) work perfectly fine. // I'm yet to see this snippet fail. FILE *antiguo, *archivo; remove("Necesidades.old"); rename("Necesidades.dat", "Necesidades.old"); antiguo = fopen("Necesidades.old", "rb"); archivo = fopen("Necesidades.dat", "wb"); Any ideas on why would this happen, and/or how can I ensure the remove() command has taken effect by the time rename() is executed? (I thought of just using a while loop to force call remove() again so long as fopen() returns a non-null pointer, but that sounds like begging for a crash due to overflowing the OS with delete requests or something.)
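
    The most common cause of this pattern on Windows, offered as a guess, is a FILE* on the .old file (the antiguo handle from a previous pass) that was never fclose()d: Windows refuses to delete or rename a file that is still open. Either way, checking the return values makes the failure visible instead of silent; a sketch (needs <errno.h>):

        /* Report failures instead of assuming remove()/rename() succeeded.
         * ENOENT from remove() just means there was nothing to delete yet. */
        if (remove("IndiceNecesidades.old") != 0 && errno != ENOENT)
            perror("remove IndiceNecesidades.old");
        if (rename("IndiceNecesidades.dat", "IndiceNecesidades.old") != 0)
            perror("rename IndiceNecesidades.dat");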


  • how to create thumbnail system for MP4 files

    - by Pete Herbert Penito
    Hi everyone! I know, I know, why am I still using MP4? It's because I already have about 100 files in this format that I need to upload to a website. I already have the MP4 file embedded in the site, and the file that plays changes according to PHP. What I really need is a way to dynamically create a thumbnail, or take a snapshot of the video file, to display on the page. I've read a couple of things online, but they all require the file to be in FLV. What would be the best way to accomplish this? Thank you, guys!
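
    FLV is not required for this: ffmpeg can pull a frame straight out of an MP4, and PHP can shell out to it when the file is uploaded. A hypothetical helper (the ffmpeg path, size and timestamp are assumptions):

        <?php
        // Grab one frame a few seconds into the clip as a 320x180 JPEG thumbnail.
        function make_thumbnail($videoPath, $thumbPath, $seconds = 5) {
            $cmd = sprintf('ffmpeg -ss %d -i %s -vframes 1 -s 320x180 %s 2>&1',
                           $seconds, escapeshellarg($videoPath), escapeshellarg($thumbPath));
            exec($cmd, $output, $status);
            return $status === 0;   // true when the thumbnail was written
        }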


  • Java Oracle Installation /usr/bin/java: cannot execute binary file

    - by Dave
    Hi I've been trying for 1 day to get Oracle Java running on Ubuntu. I have a powermac g5 with Ubuntu 12.04 ppc64. uname -a : Linux LK37 3.2.0-53-powerpc64-smp #81-Ubuntu SMP Thu Aug 22 21:17:14 UTC 2013 ppc64 ppc64 ppc64 GNU/Linux lspci: david@LK37:~$ sudo lspc [sudo] password for david: sudo: lspc: command not found david@LK37:~$ sudo lspci 0000:00:0b.0 PCI bridge: Apple Inc. CPC945 PCIe Bridge 0000:0a:00.0 VGA compatible controller: NVIDIA Corporation NV43 [GeForce 6600 LE] (rev a2) 0001:00:00.0 Host bridge: Apple Inc. U4 HT Bridge 0001:00:01.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-X bridge (rev a3) 0001:00:02.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-X bridge (rev a3) 0001:00:03.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3) 0001:00:04.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3) 0001:00:05.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3) 0001:00:06.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3) 0001:00:07.0 PCI bridge: Apple Inc. Shasta PCI Bridge 0001:00:08.0 PCI bridge: Apple Inc. Shasta PCI Bridge 0001:00:09.0 PCI bridge: Apple Inc. Shasta PCI Bridge 0001:01:07.0 Unassigned class [ff00]: Apple Inc. Shasta Mac I/O 0001:01:0b.0 USB controller: NEC Corporation OHCI USB Controller (rev 43) 0001:01:0b.1 USB controller: NEC Corporation OHCI USB Controller (rev 43) 0001:01:0b.2 USB controller: NEC Corporation uPD72010x USB 2.0 Controller (rev 04) 0001:03:0c.0 IDE interface: Broadcom K2 SATA 0001:03:0d.0 Unassigned class [ff00]: Apple Inc. Shasta IDE 0001:03:0e.0 FireWire (IEEE 1394): Apple Inc. Shasta Firewire 0001:05:04.0 Ethernet controller: Broadcom Corporation NetXtreme BCM5780 Gigabit Ethernet (rev 03) 0001:05:04.1 Ethernet controller: Broadcom Corporation NetXtreme BCM5780 Gigabit Ethernet (rev 03) david@LK37:~$ I tried various ways to install Oracle Java but I always end up with: bash: /usr/bin/java: cannot execute binary file At the moment I have Installed jdk-7u25-linux-x64.tar.gz in /usr/lib/jvm/jdk1.7.0/bin/ as said in this post I already tried the web install but I get a 404 error. I hope you can help me. I started using Ubuntu yesterday so please give me the complete terminal code, it will be a lot easier for me. For those who care I want to play Minecraft and with the OpenJDK I got a java.lang error. That's why I want to install Oracle Java.
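
    "cannot execute binary file" almost always means an architecture mismatch: jdk-7u25-linux-x64 is an x86-64 build, and this PowerMac is ppc64, so the kernel simply cannot run it. Two quick checks confirm it; as far as I know Oracle does not publish a Linux/ppc64 JDK, which leaves OpenJDK (or IBM's Java) as the practical option on this hardware.

        # The machine architecture and the binary's architecture will not match:
        uname -m                               # reports ppc64
        file /usr/lib/jvm/jdk1.7.0/bin/java    # reports an x86-64 executable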


  • Using perl to parse a file and insert specific values into a database

    - by Sean
    Disclaimer: I'm a newbie at scripting in perl, this is partially a learning exercise (but still a project for work). Also, I have a much stronger grasp on shell scripting, so my examples will likely be formatted in that mindset (but I would like to create them in perl). Sorry in advance for my verbosity, I want to make sure I am at least marginally clear in getting my point across I have a text file (a reference guide) that is a Word document converted to text then swapped from Windows to UNIX format in Notepad++. The file is uniform in that each section of the file had the same fields/formatting/tables. What I have planned to do, in a basic way is grab each section, keyed by unique batch job names and place all of the values into a database (or maybe just an excel file) so all the fields can be searched/edited for each job much easier than in the word file and possibly create a web interface later on. So what I want to do is grab each section by doing something like: sed -n '/job_name_1_regex/,/job_name_2_regex/' file.txt --how would this be formatted within a perl script? (grab the section in total, then break it down further from there) To read the file in the script I have open FORMAT_FILE, 'test_format.txt'; and then use foreach $line (<FORMAT_FILE>) to parse the file line by line. --is there a better way? My next problem is that since I converted from a word doc with tables, which looks like: Table Heading 1 Table Heading 2 Heading 1/Value 1 Heading 2/Value 1 Heading 1/Value 2 Heading 2/Value 2 but the text file it looks like: Table Heading 1 Table Heading 2Heading 1/Value 1Heading 1/Value 2Heading 2/Value 1Heading 2/Value 2 So I want to have "Heading 1" and "Heading 2" as a columns name and then put the respective values there. I just am not sure how to get the values in relation to the heading from the text file. The values of Heading 1 will always be the line number of Heading 1 plus 2 (Heading 1, Heading 2, Values for heading 1). I know this can be done in awk/sed pretty easily, just not sure how to address it inside a perl script. After I have all the right values and such, linking it up to a database may be an issue as well, I haven't started looking at the way perl interacts with DBs yet. Sorry if this is a bit scatterbrained...it's still not fully formed in my head.
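
    A rough sketch of the overall shape: slurp the lines, remember which job section you are in, pair each heading with the value two lines below it, and insert with DBI. The job-name regex, the "Table Heading" match, the two-line offset and the table layout are all placeholders for whatever the real reference guide uses.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect('dbi:SQLite:dbname=jobs.db', '', '', { RaiseError => 1 });
        $dbh->do('CREATE TABLE IF NOT EXISTS job_fields (job TEXT, heading TEXT, value TEXT)');
        my $ins = $dbh->prepare('INSERT INTO job_fields (job, heading, value) VALUES (?, ?, ?)');

        open my $fh, '<', 'test_format.txt' or die "Cannot open test_format.txt: $!";
        chomp(my @lines = <$fh>);
        close $fh;

        my $current_job;
        for my $i (0 .. $#lines) {
            if ($lines[$i] =~ /^JOB_(\w+)/) {              # assumed section marker
                $current_job = $1;
            }
            elsif ($current_job && $lines[$i] =~ /^Table Heading/) {
                my $value = defined $lines[$i + 2] ? $lines[$i + 2] : '';
                $ins->execute($current_job, $lines[$i], $value);
            }
        }
        $dbh->disconnect;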


  • Use $_FILES on a page called by .ajax

    - by RachelD
    I have two .php pages that I'm working with. Index.php has a file upload form that posts back to index.php. I can access the $_FILES no problem on index.php after submitting the form. My issue is that I want (after the form submit and the page loads) to use .ajax (jQuery) to call another .php file so that file can open and process some of the rows and return the results to ajax. The ajax then displays the results and recursively calls itself to process the next batch of rows. Basically I want to process (put in the DB etc) the csv in chunks and display it for the user in between chunks. Im doing it this way because the files are 400,000+ rows and the user doesnt want to wait the 10+ min for them all to be processed. I dont want to move this file (save it) because I just need to process it and throw it away and if a user closes the page while its processing the file wont be thrown away. I could cron script it but I dont want to. What I would really like to do is pass the (single) $_FILES through .ajax OR Save it in a $_POST or $_SESSION to use on the second page. Is there any hope for my cause? Heres the ajax code if that helps: function processCSV(startIndex, length) { $.ajax({ url: "ajax-targets/process-csv.php", dataType: "json", type: "POST", data: { startIndex: startIndex, length: length }, timeout: 60000, // 1000 = 1 sec success: function(data) { // JQuery to display the rows from the CSV var newStart = startIndex+length; if(newStart <= data['csvNumRows']) { processCSV(newStart, length); } } }); } processCSV(1, 2); }); P.S. I did try this Passing $_FILES or $_POST to a new page with PHP but its not working for me :( SOS.
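
    The temp file behind $_FILES only exists for the duration of the upload request, so there is nothing for the later ajax calls to reopen; the usual compromise is to copy it once to a known path and delete it yourself when the import finishes or goes stale. A hypothetical process-csv.php, assuming index.php already did move_uploaded_file($_FILES['csv']['tmp_name'], '/tmp/pending-import.csv'):

        <?php
        $start  = isset($_POST['startIndex']) ? (int) $_POST['startIndex'] : 1;
        $length = isset($_POST['length'])     ? (int) $_POST['length']     : 2;

        $file = new SplFileObject('/tmp/pending-import.csv');

        // Count the rows once so the client knows when to stop recursing.
        $file->seek(PHP_INT_MAX);
        $numRows = $file->key() + 1;

        // Jump straight to the requested chunk (SplFileObject lines are 0-based).
        $rows = array();
        $file->seek($start - 1);
        for ($i = 0; $i < $length && !$file->eof(); $i++) {
            $rows[] = str_getcsv($file->current());
            $file->next();
        }

        header('Content-Type: application/json');
        echo json_encode(array('csvNumRows' => $numRows, 'rows' => $rows));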


  • Persist changes in C

    - by Mohit Deshpande
    I am developing a database-like application that stores a a structure containing: struct Dictionary { char *key; char *value; struct Dictionary *next; }; As you can see, I am using a linked list to store information. But the problem begins when the user exits out of the program. I want the information to be stored somewhere. So I was thinking of storing the linked list in a permanent or temporary file using fopen, then, when the user starts the program, retrieve the linked list. Here is the method that prints the linked list to the console: void PrintList() { int count = 0; struct Dictionary *current; current = head; if (current == NULL) { printf("\nThe list is empty!"); return; } printf(" Key \t Value\n"); printf(" ======== \t ========\n"); while (current != NULL) { count++; printf("%d. %s \t %s\n", count, current->key, current->value); current = current->next; } } So I am thinking of modifying this method to print the information through fprintf instead of printf and then the program would just get the infomation from the file. Could someone help me on how I can read and write to this file? What kind of file should it be, temporary or regular? How should I format the file (like I was thinking of just having the key first, then the value, then a newline character)?
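
    A regular file is the right choice here (a temporary file would defeat the point of persisting). The simplest format that fits the idea above is one "key value" pair per line, which works as long as keys and values never contain whitespace; otherwise pick a separator that cannot occur in the data. A minimal sketch follows; AddEntry() stands in for whatever insert routine the program already uses to build the list.

        /* Save: one "key value" pair per line. */
        void SaveList(const char *path) {
            FILE *f = fopen(path, "w");
            struct Dictionary *cur;
            if (f == NULL) { perror("fopen"); return; }
            for (cur = head; cur != NULL; cur = cur->next)
                fprintf(f, "%s %s\n", cur->key, cur->value);
            fclose(f);
        }

        /* Load: rebuild the list at startup; quietly do nothing on the first run. */
        void LoadList(const char *path) {
            FILE *f = fopen(path, "r");
            char key[256], value[256];
            if (f == NULL) return;
            while (fscanf(f, "%255s %255s", key, value) == 2)
                AddEntry(key, value);   /* assumed existing insert function */
            fclose(f);
        }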


  • VS2008: File creation fails randomly in unit testing?

    - by Tim
    I'm working on implementing a reasonably simple XML serializer/deserializer (log file parser) application in C# .NET with VS 2008. I have about 50 unit tests right now for various parts of the code (mostly for the various serialization operations), and some of them seem to be failing mostly at random when they deal with file I/O. The way the tests are structured is that in the test setup method, I create a new empty file at a certain predetermined location, and close the stream I get back. Then I run some basic tests on the file (varying by what exactly is under test). In the cleanup method, I delete the file again. A large portion (usually 30 or more, though the number varies run to run) of my unit tests will fail at the initialize method, claiming they can't access the file I'm trying to create. I can't pin down the exact reason, since a test that will work one run fails the next; they all succeed when run individually. What's the problem here? Why can't I access this file across multiple unit tests? Relevant methods for a unit test that will fail some of the time: [TestInitialize()] public void LogFileTestInitialize() { this.testFolder = System.Environment.GetFolderPath( System.Environment.SpecialFolder.LocalApplicationData ); this.testPath = this.testFolder + "\\empty.lfp"; System.IO.File.Create(this.testPath); } [TestMethod()] public void LogFileConstructorTest() { string filePath = this.testPath; LogFile target = new LogFile(filePath); Assert.AreNotEqual(null, target); Assert.AreEqual(this.testPath, target.filePath); Assert.AreEqual("empty.lfp", target.fileName); Assert.AreEqual(this.testFolder + "\\empty.lfp.lfpdat", target.metaPath); } [TestCleanup()] public void LogFileTestCleanup() { System.IO.File.Delete(this.testPath); } And the LogFile() constructor: public LogFile(String filePath) { this.entries = new List<Entry>(); this.filePath = filePath; this.metaPath = filePath + ".lfpdat"; this.fileName = filePath.Substring(filePath.LastIndexOf("\\") + 1); } The precise error message: Initialization method LogFileParserTester.LogFileTest.LogFileTestInitialize threw exception. System.IO.IOException: System.IO.IOException: The process cannot access the file 'C:\Users\<user>\AppData\Local\empty.lfp' because it is being used by another process..
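
    A likely culprit, offered as a guess: File.Create() returns an open FileStream that the initialize method never closes, so the file stays locked until the garbage collector happens to finalize the stream, which is exactly the kind of thing that fails at random across a test run. A sketch of an initialize method that releases the handle immediately:

        // Dispose the stream File.Create() hands back so the file is not
        // still locked when the test body or the cleanup touches it.
        [TestInitialize()]
        public void LogFileTestInitialize()
        {
            this.testFolder = Environment.GetFolderPath(
                Environment.SpecialFolder.LocalApplicationData);
            this.testPath = Path.Combine(this.testFolder, "empty.lfp");
            using (FileStream fs = File.Create(this.testPath))
            {
                // nothing to write; the using block closes the handle right away
            }
        }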


  • MediaElement.js setSrc() Loading The File But Not Changing pluginType

    - by doubleJ
    I'm working on a page that uses mediaelement.js to play mp3/mp4/wmv (yes, we have a lot of wmv). I have a list of links and those links should change the player. My effort is to make the changes to the player through javascript so that the page doesn't refresh. This code is working, but it refreshes every time. See a live demo of the non-ajax version. <?php $file = null; $file = $_GET["file"]; $format = null; if (preg_match("/mp4/i", $file)) $format = "mp4"; if (preg_match("/webm/i", $file)) $format = "webm"; if (preg_match("/wmv/i", $file)) $format = "wmv"; if (preg_match("/mp3/i", $file)) $format = "mp3"; if (preg_match("/ogg/i", $file)) $format = "ogg"; $mime = null; if ($format == "mp4") $mime = "video/mp4"; if ($format == "webm") $mime = "video/webm"; if ($format == "wmv") $mime = "video/wmv"; if ($format == "mp3") $mime = "audio/mp3"; if ($format == "ogg") $mime = "audio/ogg"; $element = "video"; if ($format == "mp3" || $format == "ogg") $element = "audio"; // you have to escape (\) the escape (\) character (hehehe...) $poster = "media\\120701Video.jpg"; $height = "360"; if ($format == "mp3") $height = "30"; ?> <!doctype html> <html> <head> <meta charset="utf-8"> <title>Embed</title> <link rel="stylesheet" href="include/johndyer-mediaelement-b090320/build/mediaelementplayer.min.css"> <style> audio {width:640px; height:30px;} video {width:640px; height:360px;} </style> <script src="include/johndyer-mediaelement-b090320/build/jquery.js"></script> <script src="include/johndyer-mediaelement-b090320/build/mediaelement-and-player.js"></script> </head> <body> <ul> <li><a href="embed.php">Reset</a></li> <li><a href="?file=media/120701Video-AnyVideoConverter.mp4">Alternative (mp4)</a></li> <li><a href="?file=media/120701Video-Ffmpeg-Defaults.webm">Alternative (webm)</a></li> <li><a href="?file=media/AreYouHurting-Death.wmv">Alternative (wmv)</a><li> <li><a href="?file=media/AreYouHurting-Death.mp3">Alternative (mp3)</a></li> </ul> <?php if ($file) { ?> <video src="<?php echo $file; ?>" controls poster="<?php echo $poster; ?>" width="640" height="360"></video> <div id="type"></div> <script> var video = document.getElementsByTagName("video")[0]; var player = new MediaElementPlayer(video, { success: function(player) { $('#type').html(player.pluginType); } }); <?php } ?> </script> </body> </html> This code requires <video> to be loaded, initially and with a file, so that the player mode (pluginType) is set. It will, then, only play formats that the pre-established mode supports (firefox in native mode won't play mp4). See a live demo of the ajax version. 
<!doctype html> <html> <head> <meta charset="utf-8"> <title>Embed</title> <link rel="stylesheet" href="http://www.mediaelementjs.com/js/mejs-2.9.2/mediaelementplayer.min.css"> <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script> <script src="http://www.mediaelementjs.com/js/mejs-2.9.2/mediaelement-and-player.js"></script> </head> <body> <ul> <li><a href="javascript:player.pause(); player.setSrc('media/120701Video-AnyVideoConverter.mp4'); player.load(); player.play();">Alternative (mp4)</a></li> <li><a href="javascript:player.pause(); player.setSrc('media/120701Video-Ffmpeg-Defaults.webm'); player.load(); player.play();">Alternative (webm)</a></li> <li><a href="javascript:player.pause(); player.setSrc('media/AreYouHurting-Death.wmv'); player.load(); player.play();">Alternative (wmv)</a></li> <li><a href="javascript:player.pause(); player.setSrc('media/AreYouHurting-Death.mp3'); player.load(); player.play();">Alternative (mp3)</a></li> </ul> <video controls src="media/WordProductionCenter.mp4"></video> <div id="type"></div> <script> var video = document.getElementsByTagName("video")[0]; var player = new MediaElementPlayer(video, { success: function(player) { $('#type').html(player.pluginType); } }); </script> </body> </html> It seems like I need something like setType(), but I see no such option. I've read a couple pages that referenced refreshing the DOM after the javascript runs, but I haven't been able to successfully do it (I know enough about javascript to hack things around and get stuff working, but not enough to create whole new things). It is worth noting that Silverlight doesn't work with Internet Explorer 8 or Safari (not sure if it's my code, mejs, or the browsers). Also, neither Silverlight nor Flash play mp3 or webm (again, not sure where the problem lies). Is there a way to dynamically load different types of files into mediaelement?


  • question about fgets

    - by user105033
    Is this safe to do? (does fgets terminate the buffer with null) or should I be setting the 20th byte to null after the call to fgets before i call clean. // strip new lines void clean(char *data) { while (*data) { if (*data == '\n' || *data == '\r') *data = '\0'; data++; } } // for this, assume that the file contains 1 line no longer than 19 bytes // buffer is freed elsewhere char *load_latest_info(char *file) { FILE *f; char *buffer = (char*) malloc(20); if (f = fopen(file, "r")) if (fgets(buffer, 20, f)) { clean(buffer); return buffer; } free(buffer); return NULL; }

