Search Results

Search found 45804 results on 1833 pages for 'large files'.


  • Trying to migrate Windows 7 install of Adobe CS5 to Ubuntu 12.04 with Wine - 'Internal errors - invalid parameters received'

    - by Don
    I have Adobe CS5 installed and running on the Windows 7 side of my machine. Since I'd hate to boot up into Windows just to use Photoshop, I'm trying to get it in Ubuntu 12.04. Tutorials I found suggested that the easiest way to have it in Ubuntu was to install Wine and copy my Windows installation over. Here are the exact steps I've done up to this point:
    1. From Windows, exported the registry key for HKEY_LOCAL_MACHINE\SOFTWARE\Adobe to the desktop.
    2. Changed to Ubuntu, downloaded Wine from Software Center, then in a terminal:
       $ sudo apt-get install wine ttf-mscorefonts-installer
       $ winecfg
       $ wget http://www.kegel.com/wine/winetricks
       $ sh winetricks msxml6 gdiplus gecko vcrun2005sp1 vcrun2008 msxml3 atmlib
    3. Moved the registry export to my home folder.
    4. Copied:
       "Program Files (x86)\Adobe" to "~/.wine/drive_c/Program Files (x86)/Adobe"
       "Program Files (x86)\Common Files\Adobe" to "~/.wine/drive_c/Program Files (x86)/Common Files/Adobe"
       "Documents and Settings\Don\Application Data\Adobe" to "~/.wine/drive_c/users/don/Application Data/Adobe"
       "Windows\System32\odbcint.dll" to "~/.wine/drive_c/windows/system32/odbcint.dll"
       and lastly "Windows\System32\odbc32.dll" to "~/.wine/drive_c/windows/system32/odbc32.dll"
    5. From a terminal: $ wine regedit adobe.reg
    6. Right-clicked on Photoshop.exe and selected "Open with Wine". Got the message "Wine Program Crash, Internal errors - invalid parameters received."
    So to restate my question: how can I get Photoshop running in Ubuntu 12.04? I'm not sold on doing it in this specific way, I just want to use Photoshop without having to reboot. What's the best way to make this happen? Edit: I do not have the installation CD, no.
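
    A quick way to get more detail than the generic crash dialog is to launch the copied executable from a terminal with Wine's debug channels enabled; the log usually names the DLL or registry key it choked on. This is only a diagnostic sketch: the path to Photoshop.exe under the copied Adobe folder is an assumption and should be adjusted to match the actual install.

        # Run Photoshop from a terminal so Wine prints its error output
        # (the path below is a guess; point it at the actual copied Photoshop.exe)
        cd ~/.wine/drive_c/"Program Files (x86)"/Adobe/"Adobe Photoshop CS5"
        WINEDEBUG=+loaddll,+module wine Photoshop.exe 2>&1 | tee ~/photoshop-wine.log

        # Then search the log for modules Wine could not load
        grep -i "err:" ~/photoshop-wine.log | less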

    Read the article

  • SQL SERVER – Fix Visual Studio Error : Connections to SQL Server files (.mdf) require SQL Server Express 2005 to function properly. Please verify the installation of the component or download from the URL

    - by pinaldave
    In one of my virtual environments, while I was trying to add a SQL Server database (.mdf) file to an ASP.NET project, I encountered the following error: "Connections to SQL Server files (.mdf) require SQL Server Express 2005 to function properly. Please verify the installation of the component or download from the URL." I have been using SQL Server 2012 for a long time, so this error was a bit interesting to me; I realized there should not be any need for a SQL Server 2005 installation. I quickly figured out that I can remove this error if I do as mentioned below:
    1. Open Microsoft Visual Studio.
    2. Select Tools >> Options >> Database Tools >> Data Connections.
    3. Enter the name of an installed instance in the "SQL Server Instance Name" field.
    4. Click OK.
    If you do not know the instance name, you can follow either of these options: 1) use the command-line sqlcmd utility, or 2) use SQL Server Management Studio. Is there any other way to resolve this error? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Error Messages, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: sqlcmd, Visual Studio
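
    If you go the sqlcmd route mentioned above, the -L switch asks the utility to enumerate the SQL Server instances it can see, and -S/-E connect to a named local instance with Windows authentication. A rough sketch, run from a Windows command prompt; the instance name SQLEXPRESS is just an example:

        REM List SQL Server instances visible from this machine
        sqlcmd -L

        REM Connect to a local named instance and print its name and version
        sqlcmd -S .\SQLEXPRESS -E -Q "SELECT @@SERVERNAME, @@VERSION"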

    Read the article

  • Developer momentum on open source projects

    - by sashang
    Hi, I've been struggling to develop momentum contributing to open source projects. I have in the past tried with gcc and contributed a fix to libstdc++, but it was a one-off, and even though I spent months of my spare time on the dev mailing list and reading through things, I just never seemed to develop any momentum with the code. Eventually I unsubscribed, got my free time back and uncluttered my mailbox.
    Like a lot of people I have some little defunct open source projects lying around on the net, but they're not large and I'm the only contributor. At the moment I'm more interested in contributing to a large open source project, and I want to know how people got started, because I find it difficult while working full time to develop any momentum with the code base. Other, more regular contributors, who are on the project full-time, are able to make changes at will and as a result enter that positive feedback cycle where they understand the code and also know where it's heading. It makes the barrier to entry higher for those that come along later.
    My questions are to people who actively contribute to large open source projects, like the Linux kernel, or gcc, or clang/llvm, or anything else with, say, a developer head count of more than 10. How did you get started? Was there a large chunk of time in your life that you could just dedicate to working on the project? I know in Linus's case he had a chunk of time (6 months) to get it started. What barriers to entry did you encounter? Can you describe the initial stages of the time spent with the project, from when you had little understanding of the code to when you understood enough to commit regularly? Thanks

    Read the article

  • Configure Unity Lenses and what they search

    - by Sindre
    I'm using Ubuntu 12.10. I've read about a lens (ppa:pydave/unity-lenses) that you can use to replace the original Files and Folders lens, so you can search all your files instead of the current one, which only searches used/recently used files and programs. I couldn't get this to work with 12.10; I got a bunch of errors when I tried adding the PPA.
    I would like to set up a lens that can search all my files and folders (from all three of my HDDs), one that searches through my videos (with the ability to specify which folders), and the same for music. So basically I would like to set up three specific lenses that each get a set of specified folders to search through. If this is not possible, is there at least a way to configure the current Files and Folders lens to ignore certain folders? I don't like it when my dash shows files that I don't want to be shown.
    I should add that I'm completely new to Ubuntu, and I apologize beforehand if this information could easily be found, but I wasn't able to find something like this. Edit: I found out how I can use the Privacy application to ignore what I want, so that's sorted now. Sorry for not researching it more. But my question regarding the lenses still stands. All help is greatly appreciated.

    Read the article

  • Configuring SQL Server Management Studio to use Windows Integrated Authentication … from non-

    - by Enrique Lima
    Did you know you can pass your Windows credentials to SQL Server even when working from a workstation that is not joined to a domain? Here is how. From Start, click All Programs and find Microsoft SQL Server (version 2005 or 2008). Once there, right-click on SQL Server Management Studio, then click on Properties. Now the real task (we will be using the runas command): modify the shortcut's Target as follows, and remember to replace <domain\user> with the values that correspond to your environment:
    x64, SQL Server 2008:
    C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\binn\VSShell\Common7\IDE\Ssms.exe -nosplash"
    x64, SQL Server 2005:
    C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files (x86)\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe -nosplash"
    x86, SQL Server 2008:
    C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files\Microsoft SQL Server\100\Tools\binn\VSShell\Common7\IDE\Ssms.exe -nosplash"
    x86, SQL Server 2005:
    C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe -nosplash"
    Since we modified the shortcut, we will need to fix the icon for SSMS. We will fix it by pressing the Change Icon… button and pointing to the original "icon" providers. It is the executables for SSMS that hold the icon information, so we need to point to:
    x64, SQL Server 2008: %ProgramFiles% (x86)\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\Ssms.exe
    x64, SQL Server 2005: C:\Program Files (x86)\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe
    x86, SQL Server 2008: %ProgramFiles%\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\Ssms.exe
    x86, SQL Server 2005: C:\Program Files\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe
    When you start SSMS from the modified shortcut, you'll be prompted for your domain password. SSMS will show a different account in the username box, but the parameters from the configuration you are doing above do work and will be passed on correctly.

    Read the article

  • Linking Libraries in iOS?

    - by Bob Dole
    This is probably a totally noob question, but I have missing links in my mind when thinking about linking libraries in iOS. I usually just add a new library that's been cross-compiled and set the build and linker paths without really knowing what I'm doing. I'm hoping someone can help me fill in some gaps. Let's take the OpenCV library for instance. I have this totally working, by the way, because of a really well written tutorial (http://niw.at/articles/2009/03/14/using-opencv-on-iphone/en), but I'm just wanting to know what exactly is going on.
    What I think is happening is that when I build OpenCV for iOS, I'm creating object code that gets placed in the .a files. This object code is just the implementation files (.m) compiled. One reason you would want to do this is to make it hard to see the source code, and so that you don't have to compile that source code every time. The .h files won't be put in the library (.a). You include the .h in your source files, and these header files communicate with the object code library (.a) in some way. You also have to include the header files for your library in the Build Path and the library itself in the Linker Path. So, is the way I view linking libraries correct? If not, can someone correct me on this?
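
    One way to convince yourself that a .a file really is just an archive of compiled object files (with the headers kept separate) is to poke at it from the command line. A rough sketch, assuming the static library from the tutorial is called libopencv.a; the file name is a placeholder:

        # List the object files bundled inside the static library
        ar -t libopencv.a | head

        # Show the external symbols the library defines (the functions your headers declare)
        nm -g libopencv.a | grep " T " | head

        # At link time the linker pulls in only the object files your code actually references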

    Read the article

  • Bundling in Visual Studio 2012 for web optimization

    - by Jalpesh P. Vadgama
    I have been writing a series of posts about Visual Studio 2012 features, describing what is new in Visual Studio 2012, and this post is also part of that series. As we know, these days web applications and sites provide more and more features, and because of that we include lots of JavaScript and CSS files in our web applications. Once the site loads, all of those JavaScript and CSS files are loaded in the browser, and if you have lots of JavaScript files it takes a lot of time for the browser to request them all. The accompanying screenshots showed the situation: a total of 25 files loaded, at more than 1 MB of total size. Since we need our application or site to be very responsive and high-performing, this becomes a performance bottleneck.
    In situations like this, the bundling feature of Visual Studio 2012 and ASP.NET 4.5 comes in very handy. With its help we can optimize and increase the performance of our application. To enable this feature we simply set debug="false" on the compilation element in the application's web.config. Once you enable it and run the application, the browser traffic shows far fewer items: only 8 in the example. So after enabling bundling, all the JS and CSS files are automatically combined into one request. Isn't that a cool feature? It is surely going to have a great impact on performance. Hope you like it. Stay tuned for more. Till then, happy programming!!

    Read the article

  • Automatic syncing USB HD drive to local HD (different computers)

    - by luri
    I have a desktop PC at work, and a laptop at home... or elsewhere. What I want to do is use a USB HD to store my documents (about 130 GB, maybe more). That would serve as a backup and also let me carry my files to my laptop. I'd like both computers to automatically sync all files there with local copies, so that I can work at either of them and keep updated copies of everything in both (plus the USB drive, which would allow me to work on other computers too, apart from being another backup). Dropbox isn't a solution for me, due to pricing and the 100 GB limit. The workflow would be as follows, to clear things up:
    1. I work on PC1. Changes in files are automatically synced to the USB drive whenever a file is modified.
    2. I go home and boot PC2. I plug in the USB drive and local files are synced (if changed) with the most recent USB copies.
    3. While I work at PC2, again, changes in files spread to my USB drive.
    4. Whenever I go to PC1 again, I plug in my USB drive and again everything syncs.
    So the questions would be: a) Am I crazy? b) Can it be done? c) Will I have any file conflicts (provided I'm the only one that will modify the files)?
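
    One low-tech way to get this workflow without Dropbox is a two-way sync tool such as unison, or a pair of rsync calls wrapped in a script you run when you plug the drive in. This is only a sketch with made-up paths (~/Documents and a drive mounted at /media/usbdrive); unison in particular handles the "newest copy wins" case and conflicts better than plain rsync:

        # Two-way sync between the local documents folder and the USB drive
        # (unison keeps both sides up to date and flags conflicting edits)
        unison ~/Documents /media/usbdrive/Documents -auto

        # rsync alternative: push newer local files out, then pull newer ones back
        rsync -avu ~/Documents/ /media/usbdrive/Documents/
        rsync -avu /media/usbdrive/Documents/ ~/Documents/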

    Read the article

  • Companies and Ships

    - by TechnicalWriting
    I have worked for small, medium, large, and extra large companies, and they have something in common with ships. These metaphors have been used before, I know, but I will have a go at them.
    The small company is like a speed boat, exciting and fast, and can turn on a dime, literally. Captain and crew share a lot of the work. A speed boat has a short range and needs to refuel a lot. It has difficulty getting through bad weather. (Small companies often live quarter to quarter. By the way, if a larger company is living quarter to quarter, it is taking on water.)
    The medium company is like a battleship. It can maneuver, has a longer range, and the crew is focused on its mission. Its main concern is the other battleships trying to blow it out of the water, but it can respond quickly. Bad weather can jostle it, but it can get through most storms.
    The large company is like an aircraft carrier; a floating city. It is well-provisioned and can carry a specialized load for a very long range. Because of its size and complexity, it has to be well-organized to be effective, and most of its functions are specialized (with little to no functional cross-over). There are many divisions and layers between Captain and crew. It is not very maneuverable; it has to set its course well in advance and have a plan of action.
    The extra large company is like a cruise liner. It also has to be well-organized, and changes in direction are often slow. Some of the people are hard at work behind the scenes to run the ship; others can be along for the ride. They sail the same routes over and over again (often happily), with the occasional cosmetic face-lift to the ship and entertainment. It should stay in warm, friendly waters and avoid risky speed through fields of icebergs.
    I have enjoyed my career on the various Ships of Technical Writing, but I get the most of my juice from the battleship, where I am closer to the campaign and my contributions have the greater impact on success.
    Mark Metcalfe
    www.linkedin.com/in/MarkMetcalfe

    Read the article

  • Can't update packages or use Ubuntu Software Center (E: Some index files failed to download. They have been ignored, or old ones used instead.)

    - by user94189
    I get the following errors when trying to run the auto Update Manager:
    Could not initialize the package information
    An unresolvable problem occurred while initializing the package information. Please report this bug against the 'update-manager' package and include the following error message: 'E:Encountered a section with no Package: header, E:Problem with MergeList /var/lib/apt/lists/ppa.launchpad.net_bugs-sehe_zfs-fuse_ubuntu_dists_precise_main_binary-amd64_Packages, E:The package lists or status file could not be parsed or opened.'
    When I run sudo apt-get update, I get these errors:
    W: GPG error: http://us.archive.ubuntu.com precise Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]>
    W: GPG error: http://ppa.launchpad.net precise Release: The following signatures were invalid: BADSIG 1FFD34C9EB13C954 Launchpad CoverGloobus
    W: GPG error: http://ppa.launchpad.net precise Release: The following signatures were invalid: BADSIG 4A67F94CBF965FF5 Launchpad PPA for Happy-Neko
    W: Failed to fetch http://ppa.launchpad.net/bugs-sehe/zfs-fuse/ubuntu/dists/precise/main/source/Sources 404 Not Found
    W: Failed to fetch http://ppa.launchpad.net/bugs-sehe/zfs-fuse/ubuntu/dists/precise/main/binary-amd64/Packages 404 Not Found
    W: Failed to fetch http://ppa.launchpad.net/bugs-sehe/zfs-fuse/ubuntu/dists/precise/main/binary-i386/Packages 404 Not Found
    E: Some index files failed to download. They have been ignored, or old ones used instead.
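
    For what it's worth, the 404s usually mean the PPA publishes no packages for this release, and the MergeList/BADSIG lines often come from corrupted local list caches rather than the archive itself. A hedged sketch of the usual cleanup; the PPA name is taken from the error output above, so double-check it matches before removing anything:

        # Drop the PPA that returns 404 for precise
        sudo add-apt-repository --remove ppa:bugs-sehe/zfs-fuse

        # Clear the cached package lists and fetch fresh, signed copies
        sudo rm -rf /var/lib/apt/lists/*
        sudo apt-get update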

    Read the article

  • IZWebFileManager

    - by csharp-source.net
    IZWebFileManager is a full-featured File Manager control for ASP.NET 2.0, compatible with the most-used browsers such as MS Internet Explorer and Firefox. Features:
    * Copying, moving, renaming, and deletion of files and folders;
    * Ability to work (copy, move, delete) with several files at once;
    * File upload;
    * Easy duplication of files and folders;
    * Right-click context menu (Windows Explorer like);
    * Common shortcuts supported: arrow keys, F5 - refresh, F2 - rename, Enter - default action, Delete;
    * Permission control: you can forbid uploading, renaming or deletion of files and folders. You can limit the size of files that can be uploaded and restrict the types of files which can be uploaded by their extensions. For example, you can let users upload pictures (gifs and jpgs) only, with a size of not more than 50KB;
    * Multilingual interface. English, Russian and Hebrew are already supported. Other languages can be added without even recompiling the component;
    * Full Unicode and Right-to-Left support;
    * All major browsers supported. The component has been tested and works fine in Netscape 8.0, Firefox 1.5, IE 6.0 (SP2);
    * Optimized and compiled for .NET Framework 2.0;
    * Totally easy to install and use. No additional configuration in web.config needed. Deployed with *.dll only;
    * XHTML capability.

    Read the article

  • Why is apt-cache so slow?

    - by Damn Terminal
    After upgrade to Trusty (14.04) from Saucy (13.10), all apt operations are very slow, even those that do not include downloading anything or connecting to any servers. For example, displaying the apt policy:
    # time apt-cache policy
    [...]
    real 0m8.951s
    user 0m5.069s
    sys 0m3.861s
    takes almost ten seconds! Mostly a weird lag right after issuing the command, and it's the same even if I issue the same command again. On another system it doesn't take a tenth of a second:
    real 0m0.096s
    user 0m0.070s
    sys 0m0.023s
    The other system is a little beefier, but there was no noticeable difference before the upgrade. It's the same with apt-get, anything apt-related. How do I find out the source of this lag and fix it?
    Additional info:
    # cat /etc/nsswitch.conf
    # /etc/nsswitch.conf
    #
    # Example configuration of GNU Name Service Switch functionality.
    # If you have the `glibc-doc-reference' and `info' packages installed, try:
    # `info libc "Name Service Switch"' for information about this file.
    passwd:         compat
    group:          compat
    shadow:         compat
    hosts:          files dns
    networks:       files
    protocols:      db files
    services:       db files
    ethers:         db files
    rpc:            db files
    netgroup:       nis
    BTW, is my understanding of how apt-cache works correct? It doesn't make any network connections when I run apt-cache policy, right? In case I'm wrong and it matters, here are my sources: https://gist.github.com/anonymous/02920270ff68e23fc3ec
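
    Not a definitive fix, but two things worth trying when apt-cache itself is slow: profile the call to see where the time goes, and force apt to rebuild its binary caches (pkgcache.bin / srcpkgcache.bin), which can be left in a bad state after a release upgrade. A sketch, to be treated as diagnostics rather than a guaranteed cure:

        # Summarize which syscalls the slow run spends its time on
        strace -c apt-cache policy > /dev/null

        # Remove apt's binary caches so they get rebuilt, then time the command again
        sudo rm -f /var/cache/apt/pkgcache.bin /var/cache/apt/srcpkgcache.bin
        sudo apt-get update
        time apt-cache policy > /dev/null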

    Read the article

  • Restoring backup with Deja Dup from external HD

    - by widgg
    So, here's the problem that I have with restoring. First, I backed up like everything, so I had space on the external HD for only one copy. I had to reinstall everything, but I didn't worry because I had a backup. But now restoring doesn't work, and it's starting to get annoying.
    I right-click "Restore missing files...", then I get the popup window from Deja Dup asking where the backup is. I select the external HD and either put nothing in "folder" or just put ".", thinking that it should be the base to look for backups in. In both cases, after a while scanning, I get: The volume "Filesystem root" has only 139.7 MB disk space remaining. But my partition "/home" has 799.6 GB free. Also, I just want to restore some files; I don't need all of them.
    On my external HD there's a file called duplicity-full.20120514T220834Z.manifest, a text file. In it, I can see that everything is partitioned into files of 52 MB; small files are in one archive, while very large files are split across multiple ones. So I can see the exact list of files that I have, and I'm guessing that my backup is intact.
    What am I doing wrong? Is it possible that it fails, even just to list all the files, because I don't have enough space left on my external HD? Why can't duplicity use another location to do that?
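
    Since the files on the drive are duplicity volumes (Deja Dup is a front end for duplicity), you can also bypass the GUI and restore individual files with duplicity directly, which avoids staging the whole archive. A sketch, assuming the drive is mounted at /media/external and the backup sits in its root; the paths and the example file name are placeholders:

        # List what the most recent backup contains
        duplicity list-current-files file:///media/external > files.txt

        # Restore a single file (path exactly as shown in the listing) into /tmp
        duplicity restore --file-to-restore Documents/report.odt \
            file:///media/external /tmp/report.odt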

    Read the article

  • Good podcasting solution?

    - by burnso
    I simply ask for the best, most common and simple way to set up a podcast. Please, I need an answer so I can close this question. I run a Joomla-based website for a small church and now need a simple, cheap and effective solution for an audio podcast. I am looking for a solution that will do the following:
    - Users upload audio files to a service (preferably not to our own site) that is cheap, fast and simple to use. Dropbox, for example?
    - Files are easily embeddable into Joomla website articles to be played on the spot (through a simple-to-use media player), preferably through RSS feeds (to make it easy to update every week).
    - Files are downloadable.
    - Files are viewable on iPhones and other smartphones.
    - The solution can be broadcast via iTunes.
    - The solution doesn't need a lot of extra, new third-party software. I'd rather keep it simple and familiar than have a completely new system with a steep learning curve.
    At the moment we use Vimeo to host the podcast, through video files, but we'd like to move to something simpler that doesn't involve a series of difficult steps to upload the files to the web.

    Read the article

  • What are some best practices for minimizing code?

    - by CrystalBlue
    While maintaining the sites our development team has created, we have come across include files and plugins that have proven to be very useful to more than one part of our applications. Most of these modules have come with two different files: a normal source file and a min file. Seeing that the performance and speed of a page can be increased by minimizing the size of the files, we're looking into doing that to our pages as well.
    The problem we run into is that a lot of our normal pages (written in classic ASP) are a mix of HTML, ASP, JavaScript, CSS, and include files. We have some pages that have their JS both in include files and in the page, depending on whether the function is only really used in that page or is used in many other pages. For example, we have a common.js and an ajax.js file; both are used in a lot of pages, but not all of them. We also have some functions in a page that don't really make sense to put into one master file.
    What I have seen a few other people do online is use one master JS file, place all of their JavaScript into that, minify it, gzip it, and only use that on their production server. Again, this would be great, but I don't know if it fully works for our purposes. What I'm looking for is some direction to go with on this. I'm in favor of taking all of our JS, putting it in one include file, and just having it included in every page that is hit. However, not every page we have needs every bit of JS. So would it be worth compiling and minifying the files into one master file and including it everywhere, or would it be better to minify all the other files and still include them on a need-to-use basis?
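
    If you do go the one-master-file route, the usual pattern is a small build step that concatenates the shared scripts and minifies the result for production, while page-specific scripts stay separate (and can be minified the same way). A rough sketch using UglifyJS, which is one common choice; the file names are placeholders:

        # Concatenate the shared scripts and minify them into one file for production
        uglifyjs common.js ajax.js -c -m -o site.min.js

        # Page-specific scripts stay separate but get the same treatment
        uglifyjs checkout.js -c -m -o checkout.min.js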

    Read the article

  • Algorithms for Data Redundancy and Failover for distributed storage system?

    - by kennetham
    I'm building a distributed storage system that works with different storage sizes. For instance, my storage devices have sizes of 50GB, 70GB, 150GB, 250GB, and 1000GB: five storage devices in one system. My application will store any files to the storage system.
    Question: how can I build distributed storage with data redundancy and fail-over to store documents, videos, and any type of file, while ensuring that should any one storage device fail, there is another copy of those files on another device? The concern is that 50GB of storage can only hold a certain maximum number of files compared to 70GB, 150GB, etc. With that in mind, bringing five storage systems together like a cloud storage:
    1. Is there any logical way to distribute or store the files through my application?
    2. How do I ensure data redundancy across different storage sizes?
    3. Is there any algorithm to collate multiple blob files into a single file archive?
    4. What is the best solution for one cloud storage with multiple different storage sizes?
    I open this topic with the objective of discussing the best way to implement this idea, assuming simplicity: what are the issues of this implementation, performance measurements, and a discussion of the limitations?

    Read the article

  • The [2] table entry '[3]' has no associated entry in the Media table. (error 2602)

    - by derekf
    A coworker started getting the above message in the event log and as a dialog during install. Argument [2] was File and argument [3] was a specific file. The error dialog read:
    Product: (app name) -- The installer has encountered an unexpected error installing this package. This may indicate a problem with this package. The error code is 2602.
    The package was a vendor-provided MSI that had been installed administratively, with a patch (.msp) then applied to the administrative install point. With some digging we found that the MSI still had entries in the Media table pointing at the CAB files, and that there were several files at the end of the sequence that did not have corresponding entries in the Media table (last sequence 990 in the Media table, while the last entry in the File table had sequence 994). Attributes on files in the File table all had the msidbFileAttributesCompressed (&16384) attribute set, so they were all expected to be within the CAB files; but since this was an admin install there were no CAB files.
    Resolved by clearing the Media table (replacing it with a single entry: Disk ID 1, LastSequence 994) and going through the File table, subtracting 8192 from each entry to mark the files as not compressed. Tested and worked.

    Read the article

  • Visual Studio 2008 doesn't create *.refresh files for external DLL references... what am I missing?

    - by Cory Larson
    Hi all-- I've got a question about something that's just been irritating me. A colleague and I are building a support framework for our current client that we want to reference in other projects. The DLL we want as a reference in our project would be an external reference. We're adding it by doing "Add Reference...", then browsing to the location of the .dll. What I want Visual Studio to do is only add the .xml, .pdb, and a .dll.refresh file, but instead it copies the actual .dll (and .xml and .pdb) into the bin. When we rebuild the framework project, the other project that uses its .dll gets all out of whack until we drop and re-add the reference. Everything I've read online says that VS2008 is supposed to create the .dll.refresh files for you, but it never does. Any ideas? Am I missing something or doing something wrong? At this point I'm ready to add a pre-build event to simply copy the framework .dll into my bin, but the .refresh file seems like less of a hassle if it would just work. Thanks, Cory
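
    If you do fall back to a pre-build event, it is just a command line run before compilation, so a plain copy using the Visual Studio build macros is enough. A sketch only; the framework's output path is a placeholder to be adjusted for your solution layout:

        REM Pre-build event: copy the latest framework build next to this project's output
        copy /Y "$(SolutionDir)Framework\bin\$(ConfigurationName)\Framework.dll" "$(TargetDir)"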

    Read the article

  • NUnit test with WatiN runs OK from Dev10, but not when starting NUnit from "C:\Program Files (x

    - by judek.mp
    I have the following code in an NUnit test:
    string url = "";
    url = @"http://localhost/ClientPortalDev/Account/LogOn";
    ieStaticInstanceHelper = new IEStaticInstanceHelper();
    ieStaticInstanceHelper.IE = new IE(url);
    ieStaticInstanceHelper.IE.TextField(Find.ById("UserName")).TypeText("abc");
    ieStaticInstanceHelper.IE.TextField(Find.ById("Password")).TypeText("defg");
    ieStaticInstanceHelper.IE.Button(Find.ById("submit")).Click();
    ieStaticInstanceHelper.IE.Close();
    On right-clicking the project in Dev10 and choosing [Test With][NUnit 2.5], this test code runs with no problems (I have TestDriven installed). When opening NUnit from "C:\Program Files (x86)\NUnit 2.5.5\bin\net-2.0\nunit.exe" and then opening my test dll, the following text is reported in NUnit's Errors and Failures:
    Elf.Uat.ClientPortalLoginFeature.LoginAsWellKnownUserShouldSucceed: System.Runtime.InteropServices.COMException : Error HRESULT E_FAIL has been returned from a call to a COM component.
    As an aside, right-clicking the source .cs file in Dev10 and choosing Run Test works as well. The above test is actually part of a TechTalk.SpecFlow 1.3 step. I have NUnit 2.5.5.10112 installed, I have Watin 20.20 installed, I have ... & in my NUnit.exe.config, and I have the following App.config for my test dll:
    <configuration>
      <configSections>
        <sectionGroup name="NUnit">
          <section name="TestRunner" type="System.Configuration.NameValueSectionHandler"/>
        </sectionGroup>
      </configSections>
      <NUnit>
        <TestRunner>
          <add key="ApartmentState" value="STA" />
        </TestRunner>
      </NUnit>
      <appSettings>
        <add key="configCheck" value="12345" />
      </appSettings>
    </configuration>
    Has anyone hit this before? The NUnit test obviously runs under TestDriven's NUnit 2.5.5 but not when run outside of Dev10 and TestDriven.

    Read the article

  • Emacs frustration with web development: any working dot-files?

    - by Tony Cruise
    I really like the flexibility of Emacs, but it is really annoying to make it work. I want to use it for web development: HTML, CSS, JavaScript, PHP. I first tried emacs-starter-kit. It didn't include nXhtml. Also, the C-g key binding does not work (they call it a starter kit, but a basic key command does not work); I think it is mapped for git control. That's a frustration for a beginner. Then I replaced emacs-starter-kit with nXhtml. At least C-g works. But code completion sucks, and M-tab does not work. I tried code completion from the nXhtml menu with no success. Also, nXhtml mode didn't colorize my file when CSS is mixed with HTML. Isn't it recommended for mixed HTML, CSS and PHP files? So why doesn't it work? Why aren't the Emacs folks aware of convention over configuration? Damn! Ship something that works! Please help me before I go crazy. I use Ubuntu 10.04 and emacs-snapshot-gtk 23.1.50-1. Please guide me step by step with the URL of your working dotfile. Even though I accept I am a dummy, it is really annoying and frustrating to use Emacs.

    Read the article

  • Attaching JS files to one JS file, but with a jQuery error!

    - by uzay95
    Hi, I wanted to create one JS file which includes every other JS file, attaching them to the head tag where it is. But I am getting this error: Microsoft JScript runtime error: Object expected. This is my code:
    var baseUrl = document.location.protocol + "//" + document.location.host + '/yabant/';
    // To find root path with virtual directory
    function ResolveUrl(url) {
        if (url.indexOf("~/") == 0) {
            url = baseUrl + url.substring(2);
        }
        return url;
    }
    // Managing the JS files from a single point
    function addJavascript(jsname, pos) {
        var th = document.getElementsByTagName(pos)[0];
        var s = document.createElement('script');
        s.setAttribute('type', 'text/javascript');
        s.setAttribute('src', jsname);
        th.appendChild(s);
    }
    addJavascript(ResolveUrl('~/js/1_jquery-1.4.2.min.js'), 'head');
    $(document).ready(function() {
        addJavascript(ResolveUrl('~/js/5_json_parse.js'), 'head');
        addJavascript(ResolveUrl('~/js/3_jquery.colorbox-min.js'), 'head');
        addJavascript(ResolveUrl('~/js/4_AjaxErrorHandling.js'), 'head');
        addJavascript(ResolveUrl('~/js/6_jsSiniflar.js'), 'head');
        addJavascript(ResolveUrl('~/js/yabanYeni.js'), 'head');
        addJavascript(ResolveUrl('~/js/7_ResimBul.js'), 'head');
        addJavascript(ResolveUrl('~/js/8_HaberEkle.js'), 'head');
        addJavascript(ResolveUrl('~/js/9_etiketIslemleri.js'), 'head');
        addJavascript(ResolveUrl('~/js/bugun.js'), 'head');
        addJavascript(ResolveUrl('~/js/yaban.js'), 'head');
        addJavascript(ResolveUrl('~/embed/bitgravity/functions.js'), 'head');
    });
    The paths are right. I wanted to show you the folder structure and the Watch panel. Any help would be greatly appreciated.

    Read the article

  • Delphi 2010 - Source files randomly become read-only in editor?

    - by Justin
    Does anyone else have this problem or is my Delphi cursed somehow? I'll have a bunch of forms and files open in tabs in the editor and I'll be typing away and then suddenly everything stops - my .pas file has, seemingly at random, become read-only. Sometimes I can just right-click the tab at the top and uncheck "Read-Only" and continue, but sometimes this option is checked and greyed-out (disabled), meaning I can't uncheck it and I can't make any further edits to the file. This too seems to be random. In the latter case, the only solution is to save the file in question, which works, despite Delphi's assertion that the file is read-only, close its tab in the editor, and re-open it. Not catastrophic, really, but it's starting to become annoying. Could it be that I am hitting a keyboard command combination accidentally to do this or is this a bug in Delphi? I'm in Delphi 2010, Windows 7. Doubt it's anything to do with installed packages, but if anyone wants the list I'll generate it and attach it here.

    Read the article

  • Which Perl modules can be installed just by copying lib files?

    - by elliot100
    I'm an absolute beginner at Perl, and am trying to use some non-core modules on my shared Linux web host. I have no command line access, only FTP. Host admins will consider installing modules on request, but the ones I want to use are updated frequently (DateTime::TimeZone for example), and I'd prefer to have control over exactly which version I'm using. By experimentation, I've found some modules can be installed by copying files from the module's lib directory to a directory on the host, and using use lib "local_path"; in my script, i.e. no compiling is required to install (DateTime and DateTime::TimeZone again). How can I tell whether this is the case for a particular module? I realise I'll have to resolve dependencies myself. Additionally: if I wanted to be able to install any module, including those which require compiling, what would I be looking for in terms of hosting? I'm guessing at the moment I share a VM with several others and the minimum provision I'd need would be a dedicated VM with shell access?
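
    A rough rule of thumb: if a CPAN distribution ships no XS or C sources, it can usually be "installed" by copying its lib tree and pointing use lib at it, whereas anything with .xs or .c files needs a compiler on the host. You can check a downloaded tarball before uploading; a sketch, where the tarball name is a placeholder for whatever distribution you fetched from CPAN:

        # Inspect a CPAN tarball for compiled-extension sources
        tar tzf Some-Module-1.23.tar.gz | grep -E '\.(xs|c)$' \
            && echo "needs a compiler" || echo "no XS/C sources: likely pure Perl"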

    Read the article
