Search Results

Search found 9980 results on 400 pages for 'kernel extension'.

Page 292 of 400

  • how to open a .pdf file in a panel or iframe using asp.net c#

    - by rahul
    I am trying to open a .pdf file on a button click and display it inside a panel or an iframe. With the following code I can only open the PDF in a separate window or via a Save As prompt:

        string filepath = Server.MapPath("News.pdf");
        FileInfo file = new FileInfo(filepath);
        if (file.Exists)
        {
            Response.ClearContent();
            Response.AddHeader("Content-Disposition", "inline; filename=" + file.Name);
            Response.AddHeader("Content-Length", file.Length.ToString());
            Response.ContentType = ReturnExtension(file.Extension.ToLower());
            Response.TransmitFile(file.FullName);
            Response.End();
        }

    How do I get an iframe to display the response produced by the line below?

        Response.AddHeader("Content-Disposition", "inline; filename=" + file.Name);
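
    A response streamed with Response.TransmitFile replaces the whole page output, so the usual approach is to serve the PDF from a separate page or handler and point the iframe at it. A minimal sketch, assuming an <iframe id="pdfFrame" runat="server"> on the hosting page and a hypothetical ShowPdf.aspx that contains the streaming code above:

        // Button click on the hosting page: the iframe requests the page that
        // writes the PDF inline, so the PDF renders inside the frame.
        protected void btnShow_Click(object sender, EventArgs e)
        {
            pdfFrame.Attributes["src"] = "ShowPdf.aspx?file=News.pdf";
        }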

    Read the article

  • SVM in OpenCV: Visual Studio 2008 reported error wrongly (or is it right?)

    - by Risa
    I'm using MS Visual Studio 2008, OpenCV, C++ and SVM for an OCR-related project. The code compiled fine until yesterday; when I opened the project to continue working, VS reported this error:

        error C2664: 'bool CvSVM::train(const CvMat *,const CvMat *,const CvMat *,const CvMat *,CvSVMParams)' :
        cannot convert parameter 1 from 'cv::Mat' to 'const CvMat *'

    It didn't happen before and I haven't changed any code related to it (I only changed the parameters for the kernel). The code that triggers the error is:

        Mat curTrainData, curTrainLabel;
        CvSVM svm;
        ...
        svm.train(curTrainData, curTrainLabel, Mat(), Mat(), params);

    If I hover over the call, I still get the expected tooltip (screenshot omitted), which suggests my syntax isn't wrong. So why does VS report such an error?

    Read the article

  • Delphi-5 single-file storage solution?

    - by pastacool
    Hi! Is there a Delphi-5 solution to easily integrate single-file storage into existing code? I would like to have files like Java *.jar or OpenOffice document files, which are zipped/compressed files and folders but with their own file extension. Edit: I know some ZIP-capable components, but in a nutshell I want to access files within the "container" and use normal file handling routines on them (e.g. TStringList.SaveToFile). Any compression/decompression overhead should be handled by the component.

    Read the article

  • WPF: Soft deletes and binding?

    - by aks
    I have custom objects which implement INotifyPropertyChanged, and now I'm wondering if it is possible to implement a soft delete which would play nicely with binding. Each object would have an IsDeleted property, and if this property were set to true the object would not be displayed in the GUI. I was thinking about making a custom markup extension which would decorate the Binding class, but it didn't work out as expected. Now I'm considering using MultiBinding with IsDeleted as one of the bound properties, so that the converter would be able to figure out which object is deleted. But this solution sounds quite complicated and boring. Does anybody have an idea how to implement soft deletes for binding?
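
    One lighter-weight option is to filter at the collection-view level instead of per-binding. A minimal sketch, assuming the items live in an ObservableCollection<MyItem> named Items and that MyItem raises PropertyChanged for IsDeleted (the type and property names are illustrative):

        using System.ComponentModel;
        using System.Windows.Data;

        // The default collection view is what ItemsControls bind to, so hiding
        // deleted items here hides them everywhere the collection is shown.
        ICollectionView view = CollectionViewSource.GetDefaultView(Items);
        view.Filter = item => !((MyItem)item).IsDeleted;

        // After toggling IsDeleted on an object, refresh the view so the
        // filter is re-evaluated:
        view.Refresh();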

    Read the article

  • Extract files from ZIP file with VBScript

    - by Tester101
    When extracting files from a ZIP file I was using the following:

        Sub Unzip(strFile)
            ' This routine unzips a file. NOTE: The files are extracted to a folder
            ' in the same location using the name of the file minus the extension.
            ' EX. C:\Test.zip will be extracted to C:\Test
            ' strFile (String) = Full path and filename of the file to be unzipped.
            Dim arrFile
            arrFile = Split(strFile, ".")
            Set fso = CreateObject("Scripting.FileSystemObject")
            fso.CreateFolder(arrFile(0) & "\ ")
            pathToZipFile = arrFile(0) & ".zip"
            extractTo = arrFile(0) & "\ "
            Set objShell = CreateObject("Shell.Application")
            Set filesInzip = objShell.NameSpace(pathToZipFile).items
            objShell.NameSpace(extractTo).CopyHere(filesInzip)
            fso.DeleteFile pathToZipFile, True
            Set fso = Nothing
            Set objShell = Nothing
        End Sub 'Unzip

    This was working, but now I get a "The File Exists" error. What is the reason for this? Are there any alternatives?

    Read the article

  • Compile rt73 driver (serialmonkey) on Ubuntu

    - by Curro
    Hello. I'm trying to compile some WiFi drivers I downloaded from SerialMonkey: http://rt2x00.serialmonkey.com/wiki/index.php/Downloads It's the driver for a Ralink rt73 chipset. I already have build-essential and the kernel headers installed on my computer. When I try to compile the driver using the "make" command I get this error:

        root@curro-ubuntu:/usr/src/rt73-cvs-2009041204/Module# sudo make
        make[1]: Entering directory `/usr/src/linux-headers-2.6.31-20-generic'
          Building modules, stage 2.
          MODPOST 0 modules
        make[1]: Leaving directory `/usr/src/linux-headers-2.6.31-20-generic'
        rt73.ko failed to build!
        make: *** [module] Error 1

    I've looked around the Internet and haven't found anything yet; I don't know why I am getting this error. I have Ubuntu 9.10 installed on this computer, and I've seen other people with the same issue. Any help would be greatly appreciated. Thanks in advance.

    Read the article

  • Issue converting Sitecore Item[] using ToList<T>

    - by philba888
    Working with Sitecore and LINQ extensions, I am trying to convert an item array to a list using the following piece of code:

        Item variationsFolder = masterDB.SelectSingleItem(VariationsFolderID.ToString());
        List<Item> variationList = variationsFolder.GetChildren().ToList<Item>();

    However, I keep getting this error whenever I try to build:

        'Sitecore.Collections.ChildList' does not contain a definition for 'ToList' and the best extension method overload 'System.Linq.Enumerable.ToList<TSource>(System.Collections.Generic.IEnumerable<TSource>)' has some invalid arguments

    I have the following usings:

        using System.Linq;
        using System.Xml.Linq;

    and I am referencing System.Core. I've just copied this code from another location where it works fine, so I can only think that there is something simple missing (like a reference).
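
    The error message suggests the compiler sees ChildList only as a non-generic IEnumerable, not an IEnumerable<Item>, so the generic ToList<T> overload does not apply. A minimal sketch of the usual workaround, assuming that is the case for this Sitecore version:

        // Cast<Item>() adapts the non-generic collection to IEnumerable<Item>,
        // after which the LINQ extension methods are available.
        List<Item> variationList = variationsFolder.GetChildren().Cast<Item>().ToList();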

    Read the article

  • Matplotlib and WSGI/mod_python not working on Apache.

    - by Luiz C.
    Everything works as expected on the Django development server. Under Apache, the Django app also works, except when matplotlib is used. Here's the error I get:

        No module named multiarray.
        Exception Type: ImportError
        Exception Value: No module named multiarray
        Exception Location: /usr/share/pyshared/numpy/core/numerictypes.py in <module>, line 81
        Python Executable: /usr/bin/python
        Python Version: 2.6.4

    From the Python shell, both statements work: import numpy.core.multiarray and import multiarray. Any ideas? Thanks.

    As I was looking over the numpy files, I found the multiarray module, which has an .so extension. My guess is that mod_python is not loading these files.

    Read the article

  • Git on windows :|

    - by Sonic Soul
    I've been experimenting with git as my personal code repo, and it has been a bit of a disaster on Windows. I've used Subversion, CVS, and Perforce in the past; none were as annoying to use as git. I've figured out the PGP part (for GitHub), although my workstation no longer lets me check in, and after searching around it turns out that Git Bash is using PuTTY, which is not that reliable and should be configured with something else. I was not able to configure it with a Windows shell extension to get a nice visual of what is part of the repository and what is modified, with easy check-ins and easy pushes. Has anyone successfully configured some kind of Windows shell client that can efficiently and quickly synchronize various machines? It just seems to be more pain than it is worth.

    Read the article

  • Installing Tangible T4 Editor in VS 2010 RC

    - by Stomp
    I installed the (free) Tangible T4 Editor from the VS Gallery, and shut down/restarted VS. I run on XP under an Administrator account. VS Extension Manager indicates that the editor is installed and enabled. Here is my problem: When I do Add New Item (as in the video here), I don't see any of the Tangible templates. What have I missed? (BTW, I looked around some more, found the T4 Toolbox, installed that, and still see no new templates for Tangible or the toolbox). Thanks!

    Read the article

  • Options for Linux OS executable archive files - self installers

    - by Matt1776
    I am looking to create a web project that can be installed by running a program. The user should be able to download an archive or tar file, run it (as an executable), and the setup script would ask for paths and configurable values, then unpack its 'payload' and sort out the contents for deployment. This would be a Linux version of an MSI installer. Is there such a thing for Linux operating systems? This does not involve kernel-level manipulation. All it needs to do is copy directories and files on the filesystem, which should cover about 80% if not more of all the *nix distributions.

    Read the article

  • Is it possible to generate dynamic proxy for static class or static method in C#?

    - by Jeffrey
    I am trying to come up with a way for (either static or instance) method calls to be intercepted by a dynamic proxy. I want to implement it as C# extension methods but am stuck on how to generate a dynamic proxy for static methods. Some intended usages:

        Repository.GetAll<T>().CacheForMinutes(10);
        Repository.GetAll<T>().LogWhenErrorOccurs();
        // or
        var repo = new Repository();
        repo.GetAll<T>().CacheForMinutes(10);
        repo.GetAll<T>().LogWhenErrorOccurs();

    I am open to any library (LinFu, Castle DynamicProxy 2, etc.). Thanks!

    Read the article

  • How to track changes in many MSSQL databases from .NET application?

    - by Yauheni Sivukha
    Problem: There are a lot of different databases, which are populated directly by many different applications (without any common application layer). Data can be accessed only through stored procedures (by policy).

    Task: The application needs to track changes in these databases and react in minimal time.

    Possible solutions:

    1) Create a trigger for each table in each database which writes events into a single table, and have the application watch that table through SqlDependency.
    2) Watch each table in each database through SqlDependency.
    3) Create a trigger for each table in each database which notifies the application using a managed extension.

    Which is the best way?
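
    For options 1 and 2, the .NET side is a query-notification subscription. A minimal sketch of watching a single events table with SqlDependency (the connection string, table, and column names are placeholders; Service Broker must be enabled on the database, and the query has to follow the notification rules, e.g. explicit columns and two-part table names):

        using System.Data.SqlClient;

        SqlDependency.Start(connectionString);

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT EventId, TableName FROM dbo.ChangeEvents", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
            {
                // Fires once per subscription: inspect e.Info, re-read the data,
                // and create a new SqlDependency to keep listening.
            };

            conn.Open();
            cmd.ExecuteReader().Dispose();   // executing the command registers the subscription
        }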

    Read the article

  • Is it possible to add -pedantic to GCC command line, yet have it not warn about 'long long'

    - by doublep
    I'm using mostly GCC to develop my library, but I'd like to ensure cross-compiler compatibility and especially standards conformance as much as possible. For this, I have added several -W... flags to the command line. I'd also like to add -pedantic, but I have a problem with its warning about the long long type. The latter is important for my library and is properly guarded with #if code, i.e. it is not compiled on compilers that don't know it anyway. In short: can I have GCC in -pedantic mode warn about any extension except long long?

    Read the article

  • Cannot install Curb gem on Netbeans 6.9 R2

    - by Zeck
    Hi guys, I'm trying to install feedzirra. When I install curb in my NetBeans I get the following errors:

        Building native extensions.  This could take a while...
        ERROR:  Error installing curb-0.7.6.gem:
                ERROR: Failed to build gem native extension.

        "C:/Program Files/NetBeans 6.9 RC2/ruby/jruby-1.5.0/bin/jruby.bat.exe" extconf.rb
        '"C:/Program Files/NetBeans 6.9 RC2/ruby/jruby-1.5.0/bin/jruby.bat.exe"' is not recognized as an internal or external command, operable program or batch file.

        Gem files will remain installed in C:/Program Files/NetBeans 6.9 RC2/ruby/jruby-1.5.0/lib/ruby/gems/1.8/gems/curb-0.7.6 for inspection.
        Results logged to C:/Program Files/NetBeans 6.9 RC2/ruby/jruby-1.5.0/lib/ruby/gems/1.8/gems/curb-0.7.6/ext/gem_make.out

    Any ideas how I can get this to work? Or do you know of a very fast feed parser gem?

    Read the article

  • Enabling PostgreSQL support in PHP on Mac OS X

    - by Jordan Scales
    I'm having a terribly difficult time getting pg_connect() to work properly on my Mac. I'm currently writing a PHP script (to be executed from the console) to read a PostgreSQL database and email a report. I've gone into my php.ini file and added:

        extension=pgsql.so

    But I'm met with the following error:

        PHP Warning:  PHP Startup: Unable to load dynamic library '/usr/lib/php/extensions/no-debug-non-zts-20090626/php_pgsql.so' - dlopen(/usr/lib/php/extensions/no-debug-non-zts-20090626/php_pgsql.so, 9): image not found in Unknown on line 0
        PHP Fatal error:  Call to undefined function pg_connect() in... (blah file here)

    When running phpinfo(), I see nothing about PostgreSQL, so what is my issue here?

    Read the article

  • Using transactions with ADO.NET Data Adapters.

    - by Ergwun
    Scenario: I want to let multiple (2 to 20, probably) server applications use a single database via ADO.NET. I want individual applications to be able to take ownership of sets of records in the database, hold them in memory (for speed) in DataSets, respond to client requests on the data, perform updates, and prevent other applications from updating those records until ownership has been relinquished. I'm new to ADO.NET, but it seems like this should be possible using transactions with data adapters (the ADO.NET disconnected layer).

    Question part 1: Is that the right way to try to do this?

    Question part 2: If that is the right way, can anyone point me at any tutorials or examples of this kind of approach (in C#)?

    Question part 3: If I want to be able to take ownership of individual records and release them independently, am I going to need a separate transaction for each record, and by extension a separate DataAdapter and DataSet to hold each record, or is there a better way to do that? Each application will likely hold ownership of thousands of records simultaneously.
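
    For reference, this is roughly how a data adapter and a transaction fit together; whether one long-lived transaction per owned record scales to thousands of records is a separate design question. A minimal sketch (the connection string and table name are placeholders):

        using System.Data;
        using System.Data.SqlClient;

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlTransaction tx = conn.BeginTransaction();

            var adapter = new SqlDataAdapter();
            adapter.SelectCommand = new SqlCommand("SELECT * FROM Records", conn, tx);

            // The command builder's generated INSERT/UPDATE/DELETE commands reuse
            // the SelectCommand's connection and transaction.
            var builder = new SqlCommandBuilder(adapter);

            var table = new DataTable();
            adapter.Fill(table);

            // ... modify rows in memory, serve client requests ...

            adapter.Update(table);   // changes go back to the server inside the transaction
            tx.Commit();
        }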

    Read the article

  • PyML 0.7.2 - How to prevent accuracy from dropping after storing/loading a classifier?

    - by Michael Aaron Safyan
    This is a followup from "Save PyML.classifiers.multi.OneAgainstRest(SVM()) object?". The solution to that question was close, but not quite right: the SparseDataSet is broken, so attempting to save/load with that dataset container type will fail no matter what. Also, PyML is inconsistent about whether labels should be numbers or strings; it turns out that the oneAgainstRest function is actually not good enough, because the labels need to be strings and simultaneously convertible to floats (there are places where a label is assumed to be a string and elsewhere converted to float). So after a great deal of hacking I was finally able to figure out a way to save and load my multi-class classifier without it blowing up with an error. However, although it no longer gives me an error message, it is still not quite right, as the accuracy of the classifier drops significantly when it is saved and then reloaded (so I'm still missing a piece of the puzzle). I am currently using the following custom multi-class classifier for training, saving, and loading:

        class SVM(object):
            def __init__(self, features_or_filename, labels=None, kernel=None):
                if isinstance(features_or_filename, str):
                    filename = features_or_filename;
                    if labels != None:
                        raise ValueError, "Labels must be None if loading from a file.";
                    with open(os.path.join(filename, "uniquelabels.list"), "rb") as uniquelabelsfile:
                        self.uniquelabels = sorted(list(set(pickle.load(uniquelabelsfile))));
                    self.labeltoindex = {};
                    for idx, label in enumerate(self.uniquelabels):
                        self.labeltoindex[label] = idx;
                    self.classifiers = [];
                    for classidx, classname in enumerate(self.uniquelabels):
                        self.classifiers.append(PyML.classifiers.svm.loadSVM(os.path.join(filename, str(classname) + ".pyml.svm"), datasetClass=PyML.VectorDataSet));
                else:
                    features = features_or_filename;
                    if labels == None:
                        raise ValueError, "Labels must not be None when training.";
                    self.uniquelabels = sorted(list(set(labels)));
                    self.labeltoindex = {};
                    for idx, label in enumerate(self.uniquelabels):
                        self.labeltoindex[label] = idx;
                    points = [[float(xij) for xij in xi] for xi in features];
                    self.classifiers = [PyML.SVM(kernel) for label in self.uniquelabels];
                    for i in xrange(len(self.uniquelabels)):
                        currentlabel = self.uniquelabels[i];
                        currentlabels = ['+1' if k == currentlabel else '-1' for k in labels];
                        currentdataset = PyML.VectorDataSet(points, L=currentlabels, positiveClass='+1');
                        self.classifiers[i].train(currentdataset, saveSpace=False);

            def accuracy(self, pts, labels):
                logger = logging.getLogger("ml");
                correct = 0;
                total = 0;
                classindexes = [self.labeltoindex[label] for label in labels];
                h = self.hypotheses(pts);
                for idx in xrange(len(pts)):
                    if h[idx] == classindexes[idx]:
                        logger.info("RIGHT: Actual \"%s\" == Predicted \"%s\"" % (self.uniquelabels[classindexes[idx]], self.uniquelabels[h[idx]]));
                        correct += 1;
                    else:
                        logger.info("WRONG: Actual \"%s\" != Predicted \"%s\"" % (self.uniquelabels[classindexes[idx]], self.uniquelabels[h[idx]]))
                    total += 1;
                return float(correct) / float(total);

            def prediction(self, pt):
                h = self.hypothesis(pt);
                if h != None:
                    return self.uniquelabels[h];
                return h;

            def predictions(self, pts):
                h = self.hypotheses(self, pts);
                return [self.uniquelabels[x] if x != None else None for x in h];

            def hypothesis(self, pt):
                bestvalue = None;
                bestclass = None;
                dataset = PyML.VectorDataSet([pt]);
                for classidx, classifier in enumerate(self.classifiers):
                    val = classifier.decisionFunc(dataset, 0);
                    if (bestvalue == None) or (val > bestvalue):
                        bestvalue = val;
                        bestclass = classidx;
                return bestclass;

            def hypotheses(self, pts):
                bestvalues = [None for pt in pts];
                bestclasses = [None for pt in pts];
                dataset = PyML.VectorDataSet(pts);
                for classidx, classifier in enumerate(self.classifiers):
                    for ptidx in xrange(len(pts)):
                        val = classifier.decisionFunc(dataset, ptidx);
                        if (bestvalues[ptidx] == None) or (val > bestvalues[ptidx]):
                            bestvalues[ptidx] = val;
                            bestclasses[ptidx] = classidx;
                return bestclasses;

            def save(self, filename):
                if not os.path.exists(filename):
                    os.makedirs(filename);
                with open(os.path.join(filename, "uniquelabels.list"), "wb") as uniquelabelsfile:
                    pickle.dump(self.uniquelabels, uniquelabelsfile, pickle.HIGHEST_PROTOCOL);
                for classidx, classname in enumerate(self.uniquelabels):
                    self.classifiers[classidx].save(os.path.join(filename, str(classname) + ".pyml.svm"));

    I am using the latest version of PyML (0.7.2, although PyML.__version__ is 0.7.0). When I construct the classifier with a training dataset, the reported accuracy is ~0.87. When I then save it and reload it, the accuracy is less than 0.001. So there is something here that I am clearly not persisting correctly, although what that may be is completely non-obvious to me. Would you happen to know what it is?

    Read the article

  • PHP-GD: Dealing with Unicode characters

    - by sehugg
    I am developing a web service that renders characters using the PHP GD extension, using a user-selected TTF font. This works fine in ASCII-land, but there are a few problems:

    - The string to be rendered comes in as UTF-8.
    - I would like to limit the list of user-selectable fonts to only those which can render the string properly, as some fonts only have glyphs for ASCII characters, ISO 8601, etc.
    - In the case where some decorative characters are included, it would be fine to render the majority of characters in the selected font and render the decorative characters in Arial (or whatever font contains the extended glyphs).

    It does not seem like PHP-GD has sufficient support for querying font metadata to figure out whether a character can be rendered in a given font. What is a good way to get font metrics into PHP? Is there a command-line utility that can dump them in XML or another parsable format?

    Read the article

  • .net dictionary and lookup add / update

    - by freddy smith
    I am sick of writing blocks of code like this in various places:

        if (dict.ContainsKey(key))
        {
            dict[key] = value;
        }
        else
        {
            dict.Add(key, value);
        }

    and, for lookups (i.e. key -> list of values):

        if (lookup.ContainsKey(key))
        {
            lookup[key].Add(value);
        }
        else
        {
            lookup.Add(key, new List<valuetype>());
            lookup[key].Add(value);
        }

    Is there another collections library or extension method I should use to do this in one line of code, no matter what the key and value types are? E.g.:

        dict.AddOrUpdate(key, value)
        lookup.AddOrUpdate(key, value)
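
    For the plain dictionary, the indexer already does add-or-overwrite in one line (dict[key] = value). If you want the exact methods named in the question, a minimal sketch of the two extension methods (the AddOrUpdate name comes from the question, not from the BCL):

        using System.Collections.Generic;

        public static class CollectionExtensions
        {
            // Adds the key or overwrites the existing value in one call.
            public static void AddOrUpdate<TKey, TValue>(
                this IDictionary<TKey, TValue> dict, TKey key, TValue value)
            {
                dict[key] = value;
            }

            // Appends to the list for the key, creating the list on first use.
            public static void AddOrUpdate<TKey, TValue>(
                this IDictionary<TKey, List<TValue>> lookup, TKey key, TValue value)
            {
                List<TValue> list;
                if (!lookup.TryGetValue(key, out list))
                {
                    list = new List<TValue>();
                    lookup.Add(key, list);
                }
                list.Add(value);
            }
        }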

    Read the article

  • JAXB, Netbeans and Interface Insertion Plugin

    - by segolas
    Hi, I can't get my generated classes to implement any interface. This is my XML schema file:

        <xs:schema ...
                   xmlns:jxb="http://java.sun.com/xml/ns/jaxb/"
                   xmlns:ai="http://jaxb.dev.java.net/plugin/if_insertion"
                   jxb:extensionBindingPrefixes="ai">

          <xs:element name="header">
            <xs:annotation>
              <xs:appinfo>
                <ai:interfaces check="1">
                  utility.RuleInterface
                </ai:interfaces>
              </xs:appinfo>
            </xs:annotation>
            <xs:complexType>
              bla bla bla
            </xs:complexType>

    I checked the "Extension" option in the JAXB options and I have added xjc-if-ins.jar to the "Libraries" section of my project properties, but the generated Header class doesn't implement utility.RuleInterface. I can't figure out what I am doing wrong... Is something missing? Any ideas? Regards, Segolas

    Read the article

  • Maven: How to create assembly with snapshot artifacts without timestamps file name?

    - by marabol
    I have a repository containing snapshot artifacts with timestamps. I want to create an assembly that contains the dependencies, and this works fine, but the artifact names contain the timestamp. So I wonder how to remove the timestamp from the file names for the assembly only. I've used this dependencySet:

        <outputFileNameMapping>${artifact.artifactId}-${artifact.version}.${artifact.extension}</outputFileNameMapping>

    But ${artifact.version} already seems to contain the timestamp. So is there any way to get 1.1.1-SNAPSHOT instead of 1.1.1-20100323.071348-182?

    Read the article

  • How to write syntax highlighting?

    - by ML
    I am embarking on some learning and I want to write my own syntax highlighting for files in C++. Can anyone give me ideas on how to go about doing this? To me it seems that when a file is opened:

    1. It would need to be parsed to decide what type of source file it is. Trusting the extension might not be foolproof.
    2. There needs to be a way to know what keywords/commands apply to what language.
    3. There needs to be a way to decide what color each keyword/command gets.

    I want to do this on OS X, in C++ or Objective-C. Can anyone provide pointers on how I might get started with this?

    Read the article

  • How can I set MAXLENGTH on input fields automatically?

    - by Gary McGill
    I'm using ASP.NET MVC RC2. I have a set of classes that were auto-generated "LINQ to SQL classes", where each class models a table in my database. I have some default "edit" views that use the neat-o Html.TextBoxFor helper extension to render HTML input fields for the columns in a table. However, I notice that there is no MAXLENGTH attribute specified in the generated HTML. A quick Google shows that you can add explicit attributes using TextBoxFor etc., but that would mean I need to hard-code the values (and I'd need to remember to do it for every input). This seems pretty poor, considering that everything was generated automatically from the DB, so the size of the columns is known. (Although, to be fair, the web-side stuff has to work with any model objects, not just with LINQ to SQL objects.) Is there any nice way of getting this to do the right thing?
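
    One way to avoid hard-coding the lengths in the view is to read them off the model via metadata attributes. A hypothetical helper sketch, assuming the model properties carry [StringLength] annotations (the helper name is made up, and it only handles simple property expressions):

        using System;
        using System.ComponentModel.DataAnnotations;
        using System.Linq.Expressions;
        using System.Web.Mvc;
        using System.Web.Mvc.Html;

        public static class InputHelpers
        {
            // Emits a textbox whose maxlength comes from the [StringLength]
            // attribute on the bound property, if one is present.
            public static MvcHtmlString TextBoxWithMaxLengthFor<TModel, TProp>(
                this HtmlHelper<TModel> html, Expression<Func<TModel, TProp>> expression)
            {
                var member = (MemberExpression)expression.Body;
                var attr = (StringLengthAttribute)Attribute.GetCustomAttribute(
                    member.Member, typeof(StringLengthAttribute));

                object htmlAttributes = attr != null
                    ? new { maxlength = attr.MaximumLength }
                    : null;

                return html.TextBoxFor(expression, htmlAttributes);
            }
        }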

    Read the article

  • PHP - Opening uploaded DOCX files with the correct MIME TYPE

    - by user270797
    I have users uploading DOCX files which I make available for download. The issue we have been experiencing is the unknown MIME type of DOCX files, which causes IE to open these docs as ZIP files. The site is running on a Windows/IIS server, and because this is a shared host, I cannot change any server settings. I was thinking that I could just write some code that would handle DOCX files, perhaps with custom output:

        if ($extension == "docx") {
            header("Content-Disposition: attachment; ...");
            header('Content-Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document');
            // output the file contents, etc.
        }

    Would this be a viable solution? If so, can someone help fill in the gaps? (PS: I know the above syntax is not correct, just a quick example.)

    Read the article
