Search Results

Search found 49518 results on 1981 pages for 'configuration files'.

Page 107/1981 | < Previous Page | 103 104 105 106 107 108 109 110 111 112 113 114  | Next Page >

  • How can I search for files on Ubuntu?

    - by asdffdg
    How can I search for files on Ubuntu? The usual search tool does not find anything. I have installed the Tracker search tool and it does not find anything either. I tried to follow the instructions for enabling this tool by going to System → Preferences → Searching and Indexing, but where the hell is System → Preferences → Searching and Indexing? I found a program called Searching and Indexing, but it does not contain anything that is described in the instructions.
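
    In the meantime, a minimal command-line sketch for searching (the path and patterns are placeholders):

      # Find files by name anywhere under the home directory.
      find ~ -iname '*report*'

      # Find text files under ~/Documents containing a phrase.
      grep -ril 'quarterly total' ~/Documents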

    Read the article

  • Where are apt-get files stored?

    - by loolooyyyy
    I have an Ubuntu host with three virtual machines inside it, also running Ubuntu. I update the host using apt-get update, but updating the VMs takes a long time and uses a lot of bandwidth, which I'm running out of. I want to transfer the files downloaded by apt-get to the VMs; could you please tell me where they are? I'm not talking about the packages themselves, stored in /var/cache/apt/archives; I want the files that store the list of available packages on the mirror I have selected. Thanks a lot. PS: I know this question has been asked somewhere, but I can't find it!
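
    For reference, apt-get update stores the downloaded package lists in /var/lib/apt/lists. A sketch of moving them to a VM (host name and user are placeholders; assumes the same Ubuntu release, architecture and sources.list on both sides):

      # On the host, after a normal 'apt-get update': copy the per-mirror
      # index files to the VM (skip 'partial', it only holds in-progress
      # downloads).
      rsync -av --exclude=partial /var/lib/apt/lists/ user@vm1:/tmp/apt-lists/

      # On the VM, drop them into place instead of running 'apt-get update'.
      sudo cp /tmp/apt-lists/* /var/lib/apt/lists/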

    Read the article

  • SQL Server Semantic Search to Find Text in External Files

    Sometimes it is necessary to search for specific content inside documents stored in a SQL Server database. Is it possible to do this in SQL Server? Can I run T-SQL queries and find content inside Microsoft Word files? Yes, with SQL Server 2012 you can do a semantic search.
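
    To make the idea concrete, a hedged T-SQL sketch, assuming a hypothetical dbo.Documents table whose Content column has a full-text index created WITH STATISTICAL_SEMANTICS and the semantic language statistics database installed:

      -- Top key phrases the semantic index extracted from the documents.
      SELECT TOP (10) keyphrase, score
      FROM SEMANTICKEYPHRASETABLE(dbo.Documents, Content)
      ORDER BY score DESC;

      -- Documents semantically similar to the document with key 1.
      SELECT matched_document_key, score
      FROM SEMANTICSIMILARITYTABLE(dbo.Documents, Content, 1)
      ORDER BY score DESC;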

    Read the article

  • MSDeploy migrate only configuration from IIS 6 to 7

    - by Matt
    We have 19 websites, many of which have many "virtual" directories, on an IIS 6 server. I am trying to migrate the whole thing over to IIS 7 on a different server. Following the instructions on http://learn.iis.net/page.aspx/427/migrate-a-web-site-from-iis-60-to-iis-70/ I attempted the following command line execution: msdeploy -verb:sync -source:metakey=lm/w3svc -dest:package=D:\Temp\Sites.zip > D:\temp\WebDeployPackage.log It appeared to be working until I got a "There is not enough space on the disk" error. The D drive, which you'll notice is the target location for both the package and the log, has plenty of space (all the resources for the websites are about 5 GB; the drive has 200+). The C drive, though, is of limited size (6 GB), so that might be the problem. Anyway, I figure the best bet is to try to migrate only the settings/configuration, not the actual resources. We can easily deploy the resources to the new server with our NAnt build scripts, so that's not an issue. Getting all the correct configuration moved over, however, would be challenging to do manually. So, is there a way to export/package only the configuration/options of the IIS 6 server using msdeploy (or any other tool)?
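
    A hedged sketch of one thing to try: Web Deploy supports skip rules, so a variation along these lines might package the metabase settings while skipping site content. Whether objectName=contentPath applies to the metakey provider is an assumption to verify against your msdeploy version:

      rem Package IIS 6 configuration only; the -skip rule below is unverified.
      msdeploy -verb:sync -source:metakey=lm/w3svc -dest:package=D:\Temp\ConfigOnly.zip -skip:objectName=contentPath > D:\temp\ConfigOnlyPackage.log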

    Read the article

  • Win7 playback of dvr-ms files stutters

    - by Jim Lynn
    I've just had to install Windows 7 on my Media Center machine because my Vista installation had a faulty drive. I've got the latest drivers that I can find - Intel 945GM integrated Graphics, Realtek audio drivers. Things are working OK with one exception. Playback of old recordings, from dvr-ms format files, is choppy. The picture freezes for a fraction of a second, then quickly catches up. The sound is uninterrupted and doesn't pause. These freezes happen once every 5 seconds or so. It's very regular. Playback of Live TV from the digital tuner is perfectly smooth. DVD playback is perfectly smooth. As an experiment, I used the MPEG editing package VideoReDo to create a small test file in three different formats. This program takes the raw MPEG streams and repackages them into the desired container. I took the same clip and created three files in three formats: dvr-ms (Microsoft's old recorded TV format); mpg (standard MPEG); and ts (raw MPEG transport stream of the kind often produced by PVRs). When these three files are played back under Windows 7, the mpg and ts files play smoothly, but the dvr-ms file stutters. The last piece of data I have is that two other Windows 7 machines can play back dvr-ms files smoothly with no stuttering. One is a netbook, with less grunt than the media centre. So there must be something specific about my Media Center machine that's causing the problem. Does anyone have any idea where I can look now? I don't know much about AV software, codecs, filter graphs etc. but I suspect that's where the problem lies. Rendering the video isn't the problem, but extracting the streams is. How would I go about diagnosing the problem? Edited to add: I just used the GraphStudio tool to look at the filter graph on the offending PC. The filter graph it uses by default for dvr-ms looks identical to the other machines, and, interestingly, when I play the files using GraphStudio they run smoothly. Under Windows Media Player and Windows Media Center they stutter. I'd like to see the filter graph for WMP but GraphStudio won't show it. It looks like WMP and WMC are using a different decoding path to GraphStudio. Edited again to add: Today I purchased a new HDTV. The same Media Center driving the TV at 1080p is now playing back the old Recorded TV files smoothly, without stuttering. So whatever the cause of the original problem, using a different resolution seems to have removed the problem. It might also explain why nobody else has had this problem. I doubt many people use Media Centre with a 14in portable TV.

    Read the article

  • How to use an App.config file in WPF applications?

    - by Edward Tanguay
    I created an App.config file in my WPF application: <?xml version="1.0" encoding="utf-8" ?> <configuration> <appSettings> <add key="xmlDataDirectory" value="c:\testdata"/> </appSettings> </configuration> Then I try to read the value out with this: string xmlDataDirectory = ConfigurationSettings.AppSettings.Get("xmlDataDirectory"); But it says this is obsolete and that I should use ConfigurationManager, which I can't find, even searching in the class view. Does anyone know how to use config files like this in WPF?
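
    For reference, ConfigurationManager lives in the System.Configuration assembly, which WPF projects do not reference by default. After adding a reference to System.Configuration.dll, a minimal sketch reading the key from the question's App.config:

      // Requires a project reference to System.Configuration.dll.
      using System.Configuration;

      class ConfigExample
      {
          static void Main()
          {
              // Reads the appSettings key defined in App.config.
              string xmlDataDirectory = ConfigurationManager.AppSettings["xmlDataDirectory"];
              System.Console.WriteLine(xmlDataDirectory); // prints c:\testdata
          }
      }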

    Read the article

  • Can someone provide an example of seeking, reading, and writing a >4GB file using boost iostreams

    - by Queueless
    I have read that boost iostreams supposedly supports 64-bit access to large files in a semi-portable way. Their FAQ mentions 64-bit offset functions, but there are no examples of how to use them. Has anyone used this library for handling large files? A simple example of opening two files, seeking to their middles, and copying one to the other would be very helpful. Thanks.
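
    An untested sketch along the requested lines, assuming Boost.Iostreams was built with large-file support on your platform; stream_offset is a 64-bit type, so the offsets survive past the 4GB mark:

      #include <boost/iostreams/device/file.hpp>
      #include <boost/iostreams/positioning.hpp>
      #include <vector>

      namespace io = boost::iostreams;

      int main()
      {
          io::file src("big_in.dat",  BOOST_IOS::in  | BOOST_IOS::binary);
          io::file dst("big_out.dat", BOOST_IOS::out | BOOST_IOS::binary);

          // Size of the source: seek to the end and convert the returned
          // position to a 64-bit stream_offset.
          io::stream_offset size = io::position_to_offset(src.seek(0, BOOST_IOS::end));

          // Seek both files to their middles using 64-bit offsets.
          src.seek(size / 2, BOOST_IOS::beg);
          dst.seek(size / 2, BOOST_IOS::beg);

          // Copy the second half in 1MB chunks; read() returns -1 at EOF.
          std::vector<char> buf(1024 * 1024);
          std::streamsize n;
          while ((n = src.read(&buf[0], static_cast<std::streamsize>(buf.size()))) > 0)
              dst.write(&buf[0], n);

          return 0;
      }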

    Read the article

  • Uploadify works for Visual Studio but not for IIS 7 (same machines), using Forms authentication. Doe

    - by Marc
    I'm using the Uploadify jQuery control for client-side uploads. I think my IIS 7 configuration has issues with it. The Uploadify POST immediately returns HTTP/1.1 302 Found, back to my login page. I've tried to allow anonymous access to the uploading section (subfolder) plus the page (script) that processes the image in web.config, using the location node (configuration ... location). It seems the Uploadify POST is immediately blocked. Again, this worked fine just using Visual Studio 2008, but when I run the site on the same machine under IIS 7 I get the redirect. Your thoughts/ideas are very welcome!
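
    For reference, a minimal sketch of the kind of web.config carve-out this needs, with a hypothetical handler path; the Flash runtime behind Uploadify does not reliably send the Forms authentication cookie, so the upload endpoint itself must allow anonymous access:

      <!-- Allow unauthenticated POSTs to the upload handler only
           (the path is a placeholder for your processing script). -->
      <location path="Upload.ashx">
        <system.web>
          <authorization>
            <allow users="*" />
          </authorization>
        </system.web>
      </location>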

    Read the article

  • WCF configuration and ISA Proxies

    - by Morten Louw Nielsen
    Hi, I have a setup with a .NET WCF service hosted on IIS. The client apps connect to the service through a set of ISA proxies. I don't know how many, and I don't know about their configuration, etc. In the client apps I open a client to the service and make several calls via the same client. It works great in my office, but when I deploy at the customer (using the ISAs), the connection breaks after some calls. In a successful case the client will live a few seconds at most, but is that too much? I think there might be several proxies. Maybe it's using load balancing. The pseudo code is something like this: WcfClient myClient = new WcfClient(); foreach (WorkItem Item in WorkItemsStack) myClient.ProcessItem(Item); myClient.Close(); I am thinking whether I have to do something like this instead: foreach (WorkItem Item in WorkItemsStack) { WcfClient myClient = new WcfClient(); myClient.ProcessItem(Item); myClient.Close(); } Anyone with experience in this field? Kind Regards, Morten, Denmark
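
    For what it's worth, the per-call variant is the usual way to survive flaky intermediaries, with one refinement: a faulted channel must be Abort()ed rather than Close()d, or the cleanup itself throws. A sketch, where WcfClient is the hypothetical generated proxy from the question:

      // One channel per call; abort on communication failure.
      foreach (WorkItem item in workItemsStack)
      {
          WcfClient client = new WcfClient();
          try
          {
              client.ProcessItem(item);
              client.Close();            // graceful shutdown on success
          }
          catch (System.ServiceModel.CommunicationException)
          {
              client.Abort();            // channel is faulted; Close() would throw
              throw;
          }
          catch (System.TimeoutException)
          {
              client.Abort();
              throw;
          }
      }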

    Read the article

  • FluentNHibernate SQLite configuration exception - after switching to .net4

    - by stiank81
    I get an exception thrown when trying to use Fluent to configure my NHibernate connection to SQLite. The code I use to configure is as follows: var cfg = Fluently.Configure(). Database(SQLiteConfiguration.Standard.ShowSql().UsingFile("MyDb.db")). Mappings(m => m.FluentMappings.AddFromAssemblyOf<MappingsPersistenceModel>()); _sessionFactory = cfg.BuildSessionFactory(); A HibernateException is thrown when BuildSessionFactory() is called, saying: Could not create the driver from NHibernate.Driver.SQLite20Driver, NHibernate, Version=2.1.2.4000, Culture=neutral, PublicKeyToken=aa95f207798dfdb4. It has an InnerException: Exception has been thrown by the target of an invocation. Which again has an InnerException: The IDbCommand and IDbConnection implementation in the assembly System.Data.SQLite could not be found. Ensure that the assembly System.Data.SQLite is located in the application directory or in the Global Assembly Cache. If the assembly is in the GAC, use the <qualifyAssembly> element in the application configuration file to specify the full name of the assembly. Now, to me it sounds like it doesn't find System.Data.SQLite.dll, but I can't understand this. Everywhere it is referenced I have "Copy Local" set, and I have verified that it is in every build folder for projects using SQLite. I have also copied it manually to every Debug folder of the solution - without luck. What can be causing this? Any ideas? My suspicion is that it is related to .NET 4 somehow: it worked just fine when I used .NET 3.5, and the problem started when I changed to .NET 4. You can also check out this other question for a more general approach towards Fluent/.NET 4 compatibility.
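
    One well-known cause worth ruling out: the System.Data.SQLite of that era is a mixed-mode assembly built against the CLR 2 runtime, and a .NET 4 process will not load it unless the app.config opts in. A minimal sketch of the opt-in:

      <?xml version="1.0"?>
      <configuration>
        <!-- Lets a .NET 4 process load mixed-mode assemblies built against
             CLR 2, such as the older System.Data.SQLite. -->
        <startup useLegacyV2RuntimeActivationPolicy="true">
          <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0" />
        </startup>
      </configuration>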

    Read the article

  • Dreamweaver - template not recognising child files until they are opened

    - by Chris
    Hi, I've got a new section for a website which I have generated from a data source, and it has markup for using a Dreamweaver template. When I add the new files and folders to the site, then update my template, it doesn't find the new files to update. If I open one of the new files and make a change in the template, then it recognises that the new file is using the template. So it's almost like I have to touch all the files with Dreamweaver first. I've tried to open all the new files which need to use the template, but then Dreamweaver CS4 crashes, I presume because of the number of files it's opening. Anyway, does anyone know if there is a way to make Dreamweaver recognise that a block of new files belongs to the template? It doesn't seem to just work automatically. Thanks, Chris

    Read the article

  • Start exe even with missing dependency dlls?

    - by k3b
    With .NET 2.0 and later, a program refuses to start if one of its dependent (statically referenced) DLLs is missing. With .NET 1.1 and 1.0 the program started but crashed later, when trying to use functionality of the missing assembly. I wonder if there is something like a compiler switch, configuration option or a .NET [attribute] that allows me to start the app when certain DLLs are missing. Is it possible without modifying the source code (except by applying some attributes)? I don't want to manually load assemblies by program code or use IoC frameworks. Update: By "statically referenced DLLs" I mean the opposite of dynamically loading a DLL in my own program code using reflection and Assembly.Loadxxxx(). Update 2010-12-25: This scenario happens, for example, if you want to use log4net with the .NET 4 Client Profile together with a WinForms application: log4net requires System.Web.dll, which is not in the .NET 4 Client Profile. You must install .NET 4 web support to use the WinForms application that is compiled against log4net, unless there is some magic compiler switch/attribute/configuration that I am still looking for.
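
    For completeness, a sketch of the code-based fallback the question hopes to avoid: the CLR loads statically referenced assemblies lazily, so the app starts as long as nothing touches a type from the missing DLL, and an AppDomain.AssemblyResolve handler can supply a substitute from a non-default location (the fallback folder here is hypothetical):

      using System;
      using System.Reflection;

      static class Program
      {
          static void Main()
          {
              // Fires when the CLR fails to find a referenced assembly;
              // returning null lets the original load failure propagate.
              AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
              {
                  string name = new AssemblyName(args.Name).Name + ".dll";
                  string path = System.IO.Path.Combine(
                      AppDomain.CurrentDomain.BaseDirectory, "fallback", name);
                  return System.IO.File.Exists(path) ? Assembly.LoadFrom(path) : null;
              };
              RunApp(); // keep types from the missing DLL out of Main itself
          }

          static void RunApp() { /* real startup goes here */ }
      }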

    Read the article

  • archiving (ubuntu tar) hidden directories

    - by broiyan
    tar on a directory "mydir" will archive hidden files and hidden subdirectories, but tar from within "mydir" with a wildcard will not. Is this a longstanding and known inconsistency or bug or is it that hardly anybody ever looks inside a lengthy tar log long enough to notice? Edit (additional information): tar from within "mydir" with a wildcard will not "see" nor archive hidden files and hidden subdirectories in the immediate directory, with emphasis on "immediate". However, in subdirectories of "mydir" (obviously non-hidden) hidden files and hidden subdirectories will be archived.
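
    This is shell globbing rather than tar: the shell expands * before tar runs and skips dot files, so the top-level hidden entries never reach tar at all, while hidden entries inside the named (non-hidden) subdirectories are found by tar itself. A quick sketch:

      # Archiving the directory itself includes hidden entries at every level.
      tar czf mydir.tgz mydir

      # From inside mydir, '*' is expanded by the shell and skips dot files,
      # so top-level hidden entries never reach tar.
      cd mydir
      tar czf ../mydir-glob.tgz *

      # Archiving '.' instead of '*' picks up the hidden entries too.
      tar czf ../mydir-all.tgz .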

    Read the article

  • Control pdb file output from build definition file

    - by Urvi
    Hello, I am trying to generate a release build with no pdb files generated. I have seen numerous posts that suggest right-clicking on the project, selecting Properties, going to the Build tab, clicking the Advanced... button and changing Debug Info to none. This works and all, but I need to do this for a build of ~50 solutions which contain ~25 projects each! Other posts mention editing the appropriate .csproj file, but again, with so many projects, this would take a long time. Is there any way to achieve this via the TFSBuild.proj file? I have tried adding the following to the TFSBuild.proj file, with no luck: <PropertyGroup> <Configuration>Release</Configuration> <Platform>AnyCPU</Platform> </PropertyGroup> <PropertyGroup> <DebugSymbols>false</DebugSymbols> <DebugType>none</DebugType> <Optimize>true</Optimize> </PropertyGroup> The following lines print out Release|AnyCPU, none, and false, but I still see .pdb files in the $(OutputDir) folder: <Message Text="Configuration|Platform: $(Configuration)|$(Platform)" /> <Message Text="DebugType is: $(DebugType)"/> <Message Text="DebugSymbols is: $(DebugSymbols)"/> Thanks in advance, Urvi
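
    One avenue worth trying, hedged since it depends on the TFS version: TFS 2008-era TFSBuild.proj supports a CustomPropertiesForBuild property that is passed to every solution it compiles, which would override the per-project debug settings in one place:

      <!-- In TFSBuild.proj; these MSBuild properties are passed to every
           solution built. CustomPropertiesForBuild is an assumption to
           verify against your TFS version. -->
      <PropertyGroup>
        <CustomPropertiesForBuild>DebugType=none;DebugSymbols=false</CustomPropertiesForBuild>
      </PropertyGroup>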

    Read the article

  • What is the optimal hardware configuration for a heavy-load LAMP application?

    - by Piotr Kochanski
    I need to run a Linux-Apache-PHP-MySQL application (the Moodle e-learning platform) for a large number of concurrent users - I am aiming at 5000. By concurrent I mean that 5000 people should be able to work with the application at the same time. "Work" means not only database reads but writes as well. The application is not very typical, since it does a lot of inserts/updates on the database, so caching techniques do not help much. We are using the InnoDB storage engine. In addition, the application is not written with performance in mind; for instance, one Apache thread usually occupies about 30-50 MB of RAM. I would be grateful for information on what hardware is needed to build a scalable configuration that can handle this kind of load. We currently use two HP DL380s with two 4-core processors each, which handle a much lower load (typically 300-500 concurrent users). Is it reasonable to invest in this kind of box and build a cluster from them, or is it better to go with more high-end hardware? I am particularly curious how many and how powerful servers are needed (number of processors/cores, size of RAM), what network equipment should be used (what kind of switches, network cards), and what other hardware, like particular disk storage solutions, is needed. Another question is how to put everything together, that is, what the most optimal architecture is. Clustering with MySQL is rather hard (people are complaining about MySQL Cluster, even here on Stack Overflow).

    Read the article

  • How to automate configuring DotNetNuke settings for several environments?

    - by Joosh21
    Are there any recommended methods for automating the configuration of DotNetNuke settings? We will have several instances of our DNN application (prod, beta, qa, dev, local, etc.) and need to be able to configure them all the same way, and to make updates to all of them with our future releases. The settings currently needing configuration include Host Settings, Portal Settings and User Profile Definitions. Here are some approaches I have come up with so far, with a sketch of the second one below: 1) Create a Configuration module and use SQL scripts for all the settings? Is it generally safe to manipulate the DNN tables directly? With many frameworks it is recommended to use the APIs instead. 2) Create a Configuration module and implement IUpgradeable.UpgradeModule, setting the settings programmatically? 3) Create a PortalTemplate from a portal with the settings all set. I believe this will only work for creating new portals; I will not be able to update existing portals.
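
    For option 2, a sketch of the API route (method names are from DNN 5.x-era APIs and should be verified against your version); the setting keys and values here are placeholders:

      // Inside a module implementing IUpgradeable.
      using DotNetNuke.Entities.Controllers;
      using DotNetNuke.Entities.Portals;

      public string UpgradeModule(string version)
      {
          // Host-wide setting.
          HostController.Instance.Update("SMTPServer", "smtp.example.local");

          // Per-portal setting for portal 0.
          PortalController.UpdatePortalSetting(0, "SomeSettingName", "SomeValue");

          return "Settings applied for version " + version;
      }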

    Read the article

  • Problem with CruiseControl.net configuration

    - by Pawel
    Hi, I started using CruiseControl.NET to build my project. This is quite new territory for me, so I have some problems. First thing: why does ccnet copy the directory with my project to another directory? (ccnet creates a new folder named the same as the project name in ccnet.config and copies the directory with my project into it.) Second thing: the dashboard page cannot show reports for recent builds. When I click on any item in recent builds I get the page "The page cannot be found". I suppose the dashboard cannot link to the log files, but I don't know how to link them. I created one publisher: <publishers> <xmllogger logDir="c:\Branches" /> </publishers> Can anyone help me?
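
    A sketch of the relevant ccnet.config pieces (paths are placeholders): the copying is the project's workingDirectory, where ccnet keeps its own build copy of the sources, and the dashboard resolves build reports from XML logs under the project's artifactDirectory, so pointing xmllogger at an unrelated absolute logDir like c:\Branches is exactly what breaks the report links:

      <cruisecontrol>
        <project name="MyProject">
          <!-- ccnet builds from a copy it keeps here, not from your
               original folder. -->
          <workingDirectory>C:\ccnet\MyProject\Working</workingDirectory>
          <!-- The web dashboard resolves build logs relative to this. -->
          <artifactDirectory>C:\ccnet\MyProject\Artifacts</artifactDirectory>
          <publishers>
            <!-- Default logDir is 'buildlogs' under artifactDirectory;
                 leaving it unset keeps the dashboard's links working. -->
            <xmllogger />
          </publishers>
        </project>
      </cruisecontrol>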

    Read the article

  • How to enable Windows Mobile 6 logs?

    - by Serge - appTranslator
    Hi all, someone told me that one can enable Windows Mobile logging simply by creating a bunch of registry values, as explained in this TechEd article. The article is in the scope of Microsoft System Center Configuration Manager, but my client tells me that the logs can be created even without Configuration Manager. I tried, but I couldn't get the system to create any such log file. Can anyone explain how to get these logs (or why I can't get them)? TIA,

    Read the article

  • File Storage for Web Applications: Filesystem vs DB vs NoSQL engines

    - by El Yobo
    I have a web application that stores a lot of user-generated files. Currently these are all stored on the server filesystem, which has several downsides for me. When we move "folders" (as defined by our application) we also have to move the files on disk (although this is more due to strange design decisions on the part of the original developers than a requirement of storing things on the filesystem). It's hard to write tests for filesystem actions; I have a mock filesystem class that logs actions like move, delete etc. without performing them, which more or less does the job, but I don't have 100% confidence in the tests. I will be adding some other jobs which need to access the files from other services to perform additional tasks (e.g. indexing in Solr, generating thumbnails, movie format conversion), so I need to get at the files remotely. Doing this over network shares seems dodgy... Dealing with permissions on the filesystem has sometimes given us problems in the past, although now that we've moved to a pure Linux environment this should be less of an issue. What are the downsides of storing files as BLOBs in MySQL? I guess it would massively increase the database size and reduce the effectiveness of caches, but are there other problems? Do the same problems exist with NoSQL systems like Cassandra? Does anyone have any other suggestions that might be appropriate?

    Read the article

< Previous Page | 103 104 105 106 107 108 109 110 111 112 113 114  | Next Page >