Search Results

Search found 42367 results on 1695 pages for 'resource files'.

Page 239 of 1695

  • How to load image data from a resource bitmap file for a DirectShow filter?

    - by Forrest
    I need to embed one bitmap image in my DirectShow filter so that users can work with that image without caring where it comes from. First, I imported the bitmap file into the resource bundle and got an IDB_BITMAP1 identifier. Now I need to read this IDB_BITMAP1 with OpenCV's cvLoadImage or some Windows imaging API and load the image into a buffer. So the question is: how do I do this, and is it possible at all? Thanks
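
    One possible approach, sketched here in plain Win32 C (a sketch only: it assumes the bitmap was added through an .rc script so that resource.h defines IDB_BITMAP1, and the helper name is invented for illustration), is to load the resource as an HBITMAP and copy its pixels into a caller-supplied buffer with GetDIBits:

      /* Sketch: copy the pixels of IDB_BITMAP1 into a caller-supplied 24-bpp BGR buffer. */
      #include <windows.h>
      #include "resource.h"                     /* assumed to define IDB_BITMAP1 */

      static int LoadBitmapResourceBits(HINSTANCE hModule, BYTE *buffer, DWORD bufferSize,
                                        LONG *width, LONG *height)
      {
          HBITMAP hBmp = (HBITMAP)LoadImage(hModule, MAKEINTRESOURCE(IDB_BITMAP1),
                                            IMAGE_BITMAP, 0, 0, LR_CREATEDIBSECTION);
          BITMAP bm;
          BITMAPINFO bmi;
          HDC hdc;
          DWORD stride, needed;
          int ok = 0;

          if (hBmp == NULL)
              return 0;
          GetObject(hBmp, sizeof(bm), &bm);            /* query width/height */

          stride = ((DWORD)bm.bmWidth * 3 + 3) & ~3u;  /* 24-bpp rows are DWORD-aligned */
          needed = stride * (DWORD)bm.bmHeight;
          if (buffer != NULL && bufferSize >= needed) {
              ZeroMemory(&bmi, sizeof(bmi));
              bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
              bmi.bmiHeader.biWidth       = bm.bmWidth;
              bmi.bmiHeader.biHeight      = -bm.bmHeight;   /* negative => top-down scanlines */
              bmi.bmiHeader.biPlanes      = 1;
              bmi.bmiHeader.biBitCount    = 24;
              bmi.bmiHeader.biCompression = BI_RGB;

              hdc = GetDC(NULL);
              ok = GetDIBits(hdc, hBmp, 0, bm.bmHeight, buffer, &bmi, DIB_RGB_COLORS) != 0;
              ReleaseDC(NULL, hdc);
          }
          if (width)  *width  = bm.bmWidth;
          if (height) *height = bm.bmHeight;
          DeleteObject(hBmp);
          return ok;                                   /* 1 on success, 0 otherwise */
      }

    The filter could then wrap that buffer in an IplImage via cvCreateImageHeader and cvSetData instead of going through cvLoadImage, which only reads from a file path.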

  • How can I use an internal resource (e.g. R.id.myImage01) as an email attachment?

    - by Azam
    How can I use an internal resource (e.g. R.drawable.myImage01) as an email attachment? I tried the following code. It shows the file as attached, but the email actually gets sent without any attachment.

      Context context = getApplicationContext();
      String imagePath = context.getFilesDir() + "/" + "myImage01.png";
      emailIntent.putExtra(android.content.Intent.EXTRA_STREAM, Uri.parse(imagePath));

    Thanks, Azam

  • Simple WCF service with REST - "Resource cannot be found" error with the ASP.NET debug server?

    - by lesasch
    Hi, I'm testing a very simple WCF service with REST functionality enabled. It works fine on IIS, but the VS2010 debug web server always says "The resource cannot be found" when I append the parameter after the .svc file in the browser URI. Is it a known issue that the ASP.NET debug web server cannot handle REST, or am I doing something wrong? (Again: with IIS it works.)

  • Threading across multiple files

    - by Zach M.
    My program reads in files and uses threads to compute the highest prime number. When I put a print statement into the getNum() function my numbers are printing out; however, the program seems to just lag no matter how many threads I give it. Each file has 1 million integers in it. Does anyone see something obviously wrong with my code? Basically the code gives each thread 1000 integers to check before assigning a new thread. I am still a C newbie and am just learning the ropes of threading. My code is a mess right now because I have been switching things around constantly.

      #include <stdio.h>
      #include <stdlib.h>
      #include <time.h>
      #include <string.h>
      #include <pthread.h>
      #include <math.h>
      #include <semaphore.h>

      //Global variable declaration
      char *file1 = "primes1.txt";
      char *file2 = "primes2.txt";
      char *file3 = "primes3.txt";
      char *file4 = "primes4.txt";
      char *file5 = "primes5.txt";
      char *file6 = "primes6.txt";
      char *file7 = "primes7.txt";
      char *file8 = "primes8.txt";
      char *file9 = "primes9.txt";
      char *file10 = "primes10.txt";
      char **fn;                  //file name variable
      int numberOfThreads;
      int *highestPrime = NULL;
      int fileArrayNum = 0;
      int loop = 0;
      int currentFile = 0;
      sem_t semAccess;
      sem_t semAssign;

      int prime(int n) //check for prime number, return 1 for prime 0 for nonprime
      {
          int i;
          for(i = 2; i <= sqrt(n); i++)
              if(n % i == 0)
                  return(0);
          return(1);
      }

      int getNum(FILE* file)
      {
          int number;
          char* tempS = malloc(20 * sizeof(char));
          fgets(tempS, 20, file);
          tempS[strlen(tempS)-1] = '\0';
          number = atoi(tempS);
          free(tempS); //free memory for later call
          return(number);
      }

      void* findPrimality(void *threadnum) //main thread function to find primes
      {
          int tNum = (int)threadnum;
          int checkNum;
          char *inUseFile = NULL;
          int x = 1;
          FILE* file;

          while(currentFile < 10){
              if(inUseFile == NULL){ //inUseFile being used to check if a file is still being read
                  sem_wait(&semAccess); //critical section
                  inUseFile = fn[currentFile];
                  sem_post(&semAssign);
                  file = fopen(inUseFile, "r");
                  while(!feof(file)){
                      if(x % 1000 == 0 && tNum != 1){ //go for 1000 integers and then wait
                          sem_wait(&semAssign);
                      }
                      checkNum = getNum(file);
                      /* * * * * I think the issue is here * * * */
                      if(checkNum > highestPrime[tNum]){
                          if(prime(checkNum)){
                              highestPrime[tNum] = checkNum;
                          }
                      }
                      x++;
                  }
                  fclose(file);
                  inUseFile = NULL;
              }
              currentFile++;
          }
      }

      int main(int argc, char* argv[])
      {
          if(argc != 2){ //checks for number of arguments being passed
              printf("To many ARGS\n");
              return(-1);
          }
          else{ //Sets thread count to user input, checking for correct number of threads
              numberOfThreads = atoi(argv[1]);
              if(numberOfThreads < 1 || numberOfThreads > 10){
                  printf("To many threads entered\n");
                  return(-1);
              }
              time_t preTime, postTime; //creating time variables
              int i;
              fn = malloc(10 * sizeof(char*)); //create file array and initialize
              fn[0] = file1; fn[1] = file2; fn[2] = file3; fn[3] = file4; fn[4] = file5;
              fn[5] = file6; fn[6] = file7; fn[7] = file8; fn[8] = file9; fn[9] = file10;
              sem_init(&semAccess, 0, 1); //initialize semaphores
              sem_init(&semAssign, 0, numberOfThreads);
              highestPrime = malloc(numberOfThreads * sizeof(int)); //array to store each thread's highest number
              for(loop = 0; loop < numberOfThreads; loop++){ //set initial values to 0
                  highestPrime[loop] = 0;
              }
              pthread_t calculationThread[numberOfThreads]; //threads to do the work
              preTime = time(NULL); //start the clock
              for(i = 0; i < numberOfThreads; i++){
                  pthread_create(&calculationThread[i], NULL, findPrimality, (void *)i);
              }
              for(i = 0; i < numberOfThreads; i++){
                  pthread_join(calculationThread[i], NULL);
              }
              for(i = 0; i < numberOfThreads; i++){
                  printf("this is a prime number: %d \n", highestPrime[i]);
              }
              postTime = time(NULL);
              printf("Wall time: %ld seconds\n", (long)(postTime - preTime));
          }
      }

    Yes, I am trying to find the highest prime over all. I have made some headway in the last few hours, restructuring the program as spudd suggested, but currently I am getting a segmentation fault due to my use of structures: I am trying to save the largest individual primes in the struct while giving them the right indices. This is the revised code. In short, the first thread creates all the worker threads and gives them access points into a very large integer array, which they will go through looking for prime numbers. I want to implement semaphores around the while loop so that, every 2000 entries or at the end, the workers update a global prime number.

      #include <stdio.h>
      #include <stdlib.h>
      #include <time.h>
      #include <string.h>
      #include <pthread.h>
      #include <math.h>
      #include <semaphore.h>

      //Global variable declaration
      char *file1 = "primes1.txt";
      char *file2 = "primes2.txt";
      char *file3 = "primes3.txt";
      char *file4 = "primes4.txt";
      char *file5 = "primes5.txt";
      char *file6 = "primes6.txt";
      char *file7 = "primes7.txt";
      char *file8 = "primes8.txt";
      char *file9 = "primes9.txt";
      char *file10 = "primes10.txt";
      int numberOfThreads;
      int entries[10000000];
      int entryIndex = 0;
      int fileCount = 0;
      char** fileName;
      int largestPrimeNumber = 0;

      //Register functions
      int prime(int n);
      int getNum(FILE* file);
      void* findPrimality(void *threadNum);
      void* assign(void *num);

      typedef struct package{
          int largestPrime;
          int startingIndex;
          int numberCount;
      } pack;

      //Begin main code block
      int main(int argc, char* argv[])
      {
          if(argc != 2){ //checks for number of arguments being passed
              printf("To many threads!!\n");
              return(-1);
          }
          else{ //Sets thread count to user input, checking for correct number of threads
              numberOfThreads = atoi(argv[1]);
              if(numberOfThreads < 1 || numberOfThreads > 10){
                  printf("To many threads entered\n");
                  return(-1);
              }
              int threadPointer[numberOfThreads]; //Pointer array to point to entries
              time_t preTime, postTime; //creating time variables
              int i;
              fileName = malloc(10 * sizeof(char*)); //create file array and initialize
              fileName[0] = file1; fileName[1] = file2; fileName[2] = file3; fileName[3] = file4; fileName[4] = file5;
              fileName[5] = file6; fileName[6] = file7; fileName[7] = file8; fileName[8] = file9; fileName[9] = file10;
              FILE* filereader;
              int currentNum;
              for(i = 0; i < 10; i++){
                  filereader = fopen(fileName[i], "r");
                  while(!feof(filereader)){
                      char* tempString = malloc(20 * sizeof(char));
                      fgets(tempString, 20, filereader);
                      tempString[strlen(tempString)-1] = '\0';
                      entries[entryIndex] = atoi(tempString);
                      entryIndex++;
                      free(tempString);
                  }
              }
              //sem_init(&semAccess, 0, 1); //initialize semaphores
              //sem_init(&semAssign, 0, numberOfThreads);
              time_t tPre, tPost;
              pthread_t coordinate;
              tPre = time(NULL);
              pthread_create(&coordinate, NULL, assign, (void**)numberOfThreads);
              pthread_join(coordinate, NULL);
              tPost = time(NULL);
          }
      }

      void* findPrime(void* pack_array)
      {
          pack* currentPack = pack_array;
          int lp = currentPack->largestPrime;
          int si = currentPack->startingIndex;
          int nc = currentPack->numberCount;
          int i;
          int j = 0;
          for(i = si; i < nc; i++){
              while(j < 2000 || i == (nc-1)){
                  if(prime(entries[i])){
                      if(entries[i] > lp)
                          lp = entries[i];
                  }
                  j++;
              }
          }
          return (void*)currentPack;
      }

      void* assign(void* num)
      {
          int y = (int)num;
          int i;
          int count = 10000000/y;
          int finalCount = count + (10000000%y);
          int sIndex = 0;
          pack pack_array[(int)num];
          pthread_t workers[numberOfThreads]; //threads to do the work
          for(i = 0; i < y; i++){
              if(i == (y-1)){
                  pack_array[i].largestPrime = 0;
                  pack_array[i].startingIndex = sIndex;
                  pack_array[i].numberCount = finalCount;
              }
              pack_array[i].largestPrime = 0;
              pack_array[i].startingIndex = sIndex;
              pack_array[i].numberCount = count;
              pthread_create(&workers[i], NULL, findPrime, (void *)&pack_array[i]);
              sIndex += count;
          }
          for(i = 0; i < y; i++)
              pthread_join(workers[i], NULL);
      }

      //Functions
      int prime(int n) //check for prime number, return 1 for prime 0 for nonprime
      {
          int i;
          for(i = 2; i <= sqrt(n); i++)
              if(n % i == 0)
                  return(0);
          return(1);
      }
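
    By way of comparison (this is not the poster's code; the worker, slice_t and the dummy data below are invented for illustration), the usual shape of this kind of program gives each thread a read-only slice of one shared array plus its own result slot, and does the final reduction after pthread_join, so no semaphores are needed at all:

      #include <pthread.h>
      #include <stdio.h>

      #define NUM_ENTRIES 1000000            /* stand-in for the numbers read from the files */

      typedef struct {
          const int *data;                   /* shared, read-only input */
          int start, count;                  /* this worker's slice */
          int localMax;                      /* written only by this worker */
      } slice_t;

      static int is_prime(int n)
      {
          int i;
          if (n < 2) return 0;
          for (i = 2; (long)i * i <= n; i++)
              if (n % i == 0) return 0;
          return 1;
      }

      static void *worker(void *arg)         /* each thread touches only its own slice_t */
      {
          slice_t *s = arg;
          int i;
          s->localMax = 0;
          for (i = s->start; i < s->start + s->count; i++)
              if (s->data[i] > s->localMax && is_prime(s->data[i]))
                  s->localMax = s->data[i];
          return NULL;
      }

      int main(void)
      {
          enum { T = 4 };                    /* number of worker threads */
          static int data[NUM_ENTRIES];
          slice_t slices[T];
          pthread_t tid[T];
          int i, t, best = 0;

          for (i = 0; i < NUM_ENTRIES; i++)  /* dummy data; the real program fills this from the files */
              data[i] = i;

          for (t = 0; t < T; t++) {
              slices[t].data  = data;
              slices[t].start = t * (NUM_ENTRIES / T);
              slices[t].count = (t == T - 1) ? NUM_ENTRIES - slices[t].start : NUM_ENTRIES / T;
              pthread_create(&tid[t], NULL, worker, &slices[t]);
          }
          for (t = 0; t < T; t++) {
              pthread_join(tid[t], NULL);    /* reduction happens here, in the joining thread */
              if (slices[t].localMax > best) best = slices[t].localMax;
          }
          printf("largest prime seen: %d\n", best);
          return 0;
      }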

  • OS X Client & Ubuntu Server - Best way for client to access files on server?

    - by Camsoft
    I've got a local development web server running Ubuntu. I also have an iMac running OS X 10.6 which I use as a client and as my development machine. I currently have the Samba server installed on the Ubuntu box, with shares set up for all the website directories, and I use my Mac and Coda to edit the files via those shares. This generally works really well, but I noticed that my Mac was writing loads of resource-fork ._filename files everywhere. I found out the following about them: these files are created on volumes that don't natively support full HFS file characteristics (e.g. ufs volumes, Windows file shares, etc.). When a Mac file is copied to such a volume, its data fork is stored under the file's regular name, and the additional HFS information (resource fork, type & creator codes, etc.) is stored in a second file (in AppleDouble format) with a name that starts with "._". (These files are, of course, invisible as far as OS X is concerned, but not to other OSes; this can sometimes be annoying.) Does anyone know of a way of sharing files between a Mac client and a Linux server that is most compatible between the two operating systems? Ideally it needs to support the HFS filesystem so that the resource forks are not created, and it also needs to support the permissions between server and client.
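
    If Samba stays in the picture, one common mitigation (a sketch only; the share name and path are placeholders, and the lines go into the share definition in smb.conf on the Ubuntu box) is to veto the AppleDouble and .DS_Store files so the Mac cannot scatter them across the share:

      [websites]
          path = /var/www
          veto files = /._*/.DS_Store/
          delete veto files = yes

    Note that vetoing simply hides and discards those files, so any resource-fork data is lost; if the forks genuinely need to survive, serving the share over AFP with netatalk is the more usual answer.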

  • Mac OS X software always orders files alphabetically rather than by type

    - by george
    I have noticed that many Mac applications sort files alphabetically rather than by type. A good example would be Coda by panic.com: the files in its file menu are organized alphabetically. I asked them to add an option to organize files by type, and they said it's a Finder thing. So I looked at other applications to see whether they organize by type, and noticed that Dreamweaver CS4 has the same behaviour, and now Dreamweaver CS5 does too. There has to be something in the Mac that does this and that I can modify. I played with Spotlight and got it to display its files by type (thinking that was the fix), but it didn't take effect in other applications. What library are these applications using to display a file menu for their files? Here is an example of the file menu layout of Coda by panic.com (I couldn't post another link because it wouldn't let me); you can see how everything is organized alphabetically rather than by folder. I just want the file menu to show all folders first, then all the files. 1) http://www.iaddesign.com/coda.png There must be a way to modify the Mac to let me do this.

  • How to deal with the extremely big *.ost files in a Terminal Server environment which is running out of space

    - by Wolfgang Kuehne
    Our Terminal Server is running out of hard disk space, and the files that occupy most of it are the Outlook *.ost files belonging to the users who work on the Terminal Server all day through Remote Desktop. Outlook is installed on the Terminal Server and various users can use it. What would be a solution in this case? Is there a way to limit the size of the *.ost files? I read in forums that running Outlook 2010 in Cached Exchange Mode isn't best practice for an environment where disk space is a major constraint. The first thing that came to my mind was folder redirection: place the OST files (together with the AppData folder) on a network share. But that does not help, because the OST files are kept in a part of the AppData folder that cannot be redirected. Then I wondered whether it is possible to limit the size of the OST file, or to limit how long it keeps email cached - say, just the emails from the last 6 months. Another solution that came to my mind was moving the OST files somewhere else; this requires the old OST file to be removed and a new one to be created, and I am not quite sure whether the new OST file will still have the emails that were cached in the old one, or whether it will start caching from where the other one left off. What do you suggest?
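
    If capping the cache size is acceptable, the documented PST/OST size policy is one sketch of a fix (hedged: the 14.0 key below assumes Outlook 2010, the values are in megabytes, and the limits should be tested before being rolled out to the Terminal Server):

      Windows Registry Editor Version 5.00

      ; 0x1400 = 5120 MB hard cap, 0x1200 = 4608 MB warning threshold; applies to .ost as well as .pst
      [HKEY_CURRENT_USER\Software\Policies\Microsoft\Office\14.0\Outlook\PST]
      "MaxLargeFileSize"=dword:00001400
      "WarnLargeFileSize"=dword:00001200

    Limiting how far back the cache goes ("only the last 6 months") is the Sync Slider feature, which only arrived in Outlook 2013, so for a 2010 deployment the realistic choices are a size cap as above or turning Cached Exchange Mode off for the Terminal Server users.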

  • How to make multiple Excel files open in ONE window/instance of Excel 2003 in Win 7

    - by Mark
    I'm running Excel 2003 on my new Windows 7 machine. (There is also an Excel 2010 Starter pre-installed that I do not use.) I'm a heavy user of Excel - I use it all day, every day. I often have 10 or 15 sheets open at once, and many of them have cell references to each other. I also have a macro file that keeps all my shortcuts. On my old W2K machine, when I clicked on a .xls file or a shortcut to one, it would open that file in the existing instance of Excel. This is as it should be: I would have many files open in only one "window" or instance of Excel, all the files could interact with each other, the cross-file lookups worked, my macros worked, I could switch between workbooks with Ctrl+Tab or Ctrl+F6, and I could move tabs from one workbook to another. On the new Windows 7 machine, clicking on an icon opens a NEW INSTANCE of Excel every time. This is terribly frustrating: none of my connected spreadsheets work anymore, my macros don't work, I can't connect files, I can't move tabs. I'm stuck - I can't do my work! I can still open files in one instance by pressing Ctrl+O and navigating, but I need my files to open correctly on a click. I'm guessing this is a flaw in the registry, possibly because of the Starter Excel 2010 that came preloaded on my new machine. Can you walk me through a registry edit to fix this bug? Is there an easier way than a registry edit?

  • Copy all installed programs & files in a hard disk (which has 32-bit Windows 7) and clone/transfer it to another computer which has 64-bit Windows 7

    - by galacticninja
    I recently got a new PC which has 64-bit Windows 7 installed. The current PC that I am using has 32-bit Windows 7 installed. I would like to know if there is software that can copy all my installed programs and files from the hard disk of the 32-bit Windows 7 PC and transfer them to the newer PC's hard disk, which has a 64-bit version of Windows 7. This is essentially like "cloning" a hard disk, except that I would like to use a 64-bit OS on the target drive instead of also using the 32-bit OS of the source drive. I would like to do this so I can avoid reinstalling and reconfiguring my installed programs and files on the new PC. If possible, I would like the new PC to work just as my previous PC did, with the installed programs, configuration and files intact, except that the OS is now 64-bit and the hard disk has a larger capacity. I have heard of programs that can clone a hard disk, but my concern is that the 32-bit Windows 7 OS will also be cloned to the new 64-bit PC. If it is not possible to transfer my installed programs and settings the way I described, is there software that can make it easier to migrate my installed programs, their configurations and my files from a 32-bit Windows 7 PC to a 64-bit Windows 7 PC? Details: I have a SATA-to-USB adapter to copy files from the current hard disk to the newer one. The two PCs are connected through the LAN, so I can also transfer files over the network. Both PCs have only one hard disk.

  • Web Hosting: Any web host that supports more than 50,000 files?

    - by Devner
    Hi all, For my PHP & MySQL based application, I am trying to buy website hosting from a host that does not limit the number of files I can keep in my hosting account. Almost all the hosts I have looked at have a common limit of 50,000 files (some call it 50,000 nodes), and the rest (to the extent of my search) are not even close. I have gone through the various websites, Googled a lot of information, and spoken with the customer service of the hosting companies, and they said that they have a limit of 50,000 files and that's why they call it the LIMIT. Now, my application is a kind of social networking website where people can upload various files of varying size. So if 50,000 users were to join the website and upload one file each, the limit of 50,000 would be reached very easily, and my 50,001st customer would start facing file upload problems (and so would my account). So I would like to know if there are any website hosting services that do NOT impose such restrictions. In summary, I need the following: no maximum file limit (more than 50,000 files in the account); no maximum file upload limit in the server settings (10MB, 12MB, 15MB, 20MB, etc.); the ability to upload files of various types (zip, flv, jpg, png, etc.); the ability to stream audio and video (live audio & video not necessary); access to .htaccess; access to php.ini, my.cnf or my.ini (this would be a plus); SSL support; dedicated hosting (& IP) as well. Monthly payments without contracts are a plus. If you know of any such website hosting services, please post a reply (a link to the same will be appreciated). Thank you.

  • How can I move linked Word/Excel files without breaking the links under Windows 7?

    - by DOUG NEEDHAM
    I currently operate under Windows XP and have multiple links between my Word and Excel files. I have to upgrade to Windows 7. When the .doc and .xls files are converted to .docm and .xlsm, respectively, the links no longer work. The Word document is still attempting to point back to the old .xls file rather than the new file. Also, creating new links between Word and Excel within Office 2010 doesn't seem to work. I create the new link, switch it from "Auto" to "Manual" and everything works fine. But when I copy the files to another folder, the Word document is still trying to link to the file in the previous folder rather than the new folder. This always worked in Windows XP. I've been using linked Word/Excel documents for 10+ years and have never really had a problem. I'm very careful to maintain Word and Excel filenames when moving the files to a new folder. The process has always been to 1.) move the files, 2.) update the links, 3.) rename the files, and 4.) update the links again. It's my understanding that under Windows XP, links between Word and Excel are relative. But under Windows 7 (and Office 2010?), those same links become fixed.

  • How to convert JPEG JFIF files to JPEG Exif format?

    - by tigrou
    I recently put the SD card of my camera into a Windows 7 PC and started browsing the pictures on it. I noticed some were not oriented correctly, so I used the rotate feature in Windows Photo Viewer to view them the way I wanted. What I didn't know is that when the rotate feature is used, it also overwrites the picture when you press the next or previous button, resulting in a possible loss of quality (which is, in my opinion, a bad idea - the app should at least warn the user about what will happen when using such a feature). After that, I re-inserted the SD card into my camera and got a bad surprise: the rotated pictures could no longer be previewed. Instead, I got a black screen saying "Incompatible JPEG format". The other (untouched) files still work fine. To try to understand what happened, I opened a JPEG file from the camera and one generated on Windows 7 in a hex editor. Here is the difference: the camera JPEG files have an Exif tag in them (with 0xE1 in the header), while the Windows 7 JPEG files have a JFIF tag first, followed by an Exif tag (with 0xE0 in the header). So if I understand it well, both are JPEG files, but they use a different internal format. Here is my question: is it possible (using some tool) to convert JFIF files to Exif format? I understand that the original camera files have been re-encoded and have thus lost some quality (getting the originals back is impossible). What I want to know is whether I can convert them from JFIF back to Exif (without a second loss of quality, if possible).
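
    One way to get an Exif (APP1) header back without re-encoding the pixel data is the free jhead utility (a sketch; the file name is a placeholder, and -mkexif only writes a minimal Exif header, so the original camera metadata is not recovered):

      jhead -mkexif DSC_0042.jpg
      jhead -v DSC_0042.jpg

    The first command adds a minimal Exif header while leaving the compressed image data untouched; the second lists the file's sections so you can see what it now contains. Whether the camera will then preview the file depends on how strict its firmware is, so test on a copy first.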

  • Where did my backup files go? Can they be recovered?

    - by Ken
    I just purchased a Western Digital Essential SE 1TB external hard drive from Best Buy, at their recommendation, and then exchanged it for a Toshiba Canvio (I think that was the name). I have a Toshiba Qosmio X505-Q898. The Canvio locked up my computer, rewrote some kind of OS file, and erased all the restore points as well as the system image backup (according to Best Buy), just by plugging it in for the first time. It never even got to the install part or anything - I plugged it in and it fried my computer. Best Buy spent about an hour and a half on the machine, got it back to a somewhat working condition, and gave me access to my files. They now say I have to back everything up, use my recovery disk, and rewrite my OS. Enter the Essential. I brought it home last night, plugged it in and installed everything. It works perfectly, no problems. I backed up everything onto it, and I unplugged and replugged it twice to make sure that everything was on it. The Essential told me it had both the HDD and the SSD backed up. So I reinstalled my OS. I plugged the Essential in and everything loads right up, but when I went to retrieve my files, the Western Digital has nothing on it. It shows all my music, pictures, etc. as still being on my computer and needing to be backed up, but there are no files on my computer now. Where is this information coming from, and where did my files go? It's about 810GB worth of files I've amassed over several years. Is there any way to recover data from this? I plan to contact Western Digital and Best Buy; I just thought I would check here too. Any advice will be appreciated, as a lot of these files are invaluable to me.

  • How do you back up your own files? [on hold]

    - by Antonis Christofides
    I'm a system administrator and I use rsnapshot to back up some servers and duplicity for some others. Both work fine, each one with advantages and disadvantages. Despite that, I am at a loss on how to back up my own private files. I'd use duplicity to automatically back up my files to a remote server; but the problem is that once in a while I must do a full backup. My emails and important files are 9G, and I expect this to increase. Uploading through aDSL at 1Mbit would be 20 hours. Too much. rsnapshot doesn't require periodic full backups (only the first time), but it must be running on the remote server and have a means to connect to my computer; if the server is compromised (or simply if the NSA decides to use it), my own machine is also compromised. Not good. The only solution I've come up with is to use encfs, use unison to synchronize the files to a remote server, and use duplicity or rsnapshot on the remote server to back up these files. In that case, the question is whether I can sync the files on many computers; is it possible for encfs to be used with the same key on many computers? I also think that if I append one character to the unencrypted file, its encrypted encfs counterpart might change a lot, so that incrementals with duplicity would be less efficient - but not a big deal. Maybe also, when I need to restore a file, finding the correct file to restore could be a pain, because of filename encryption. I wonder whether there is any other possibility that I've overlooked. Maybe I'm asking too much for my personal use, and I should settle for an external disk?
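
    One more option that may be worth a look (a sketch; the host and paths are placeholders, and it assumes an EncFS build with --reverse support): keep the plaintext locally, mount an encrypted view of it with encfs --reverse, and let unison sync that view, so the remote end only ever sees ciphertext and never needs the key:

      encfs --reverse ~/private ~/private.cipher
      unison ~/private.cipher ssh://backuphost//srv/backup/mine
      fusermount -u ~/private.cipher

    Restores go the other way: pull the ciphertext down and mount it with a normal (non-reverse) encfs using the same configuration and password, which can be copied to any machine that should be able to read the backups. Filename encryption still makes picking out a single file awkward, as noted above.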

  • Windows 7 Explorer: how to show total size of all files in current folder?

    - by matt wilkie
    In Windows XP Explorer one can turn on the Status Bar, which shows, among other things, the total size of all the files in the current folder, or the cumulative size of the selected files. How do I get the same at-a-glance information in Windows 7? Selecting files doesn't count, as the count stops after 15 files, and it's rare that I'm concerned about total size with that few files (it's pretty easy to estimate in my head). Thanks. UPDATE: Information derived from the context menu (select, right-click, Properties) isn't "at a glance", and is not as smooth as selecting files and clicking the Details link at the bottom in any case. Thank you for fleshing out more of the available routes, though. Yes, Q19232 is similar to this one, though it is not a duplicate: that question is about easy free-space-on-disk stats, and this one is about easy used-space-by-contents-of-this-folder stats. The answer to both is the same, though. You can't! Hopefully someone will figure out how to get this lost feature back with a shell extension or something.

  • C#/.Net: What are the best practices for setting ResXResourceReader's basepath?

    - by raj.tiwari
    I am working on a Windows Forms application in C#/.NET. I want to use a resource file that contains translations of my strings. In my project in Visual Studio I have the following hierarchy:

      Project
        CS files ...
        Resources\
          resource.en-US.resx

    I am trying to read in the resource file as follows:

      m_ResourceReader = new ResXResourceReader("resources/resource.en-US.resx");

    When I run this project, Visual Studio seems to look for the resources folder in the bin/Debug output folder of my project. My questions are: What is the right way to reference a resource file? I would like my installer to place this resource file under my application's folder, as Program Files\MyApp\resources\resource.en-US.resx. What would be the way to make ResXResourceReader read it from that location? Thanks for your help. -Raj
