Search Results

Search found 4686 results on 188 pages for 'folders'.

Page 88/188 | < Previous Page | 84 85 86 87 88 89 90 91 92 93 94 95  | Next Page >

  • Problem connecting to Magento Connect

    - by amir
    Hi, I'm using Magento 1.4.0, and when I try to get to Magento Connect and download a plugin, the page says: "Error: Please check for sufficient write file permissions. Your Magento folder does not have sufficient write permissions, which this web based downloader requires. If you wish to proceed downloading Magento packages online, please set all Magento folders to have writable permission for the web server user (example: apache) and press the "Refresh" button to try again." Does anyone know how I can fix this problem? Thanks. Update: the plugin I'm trying to use is the MagentoPycho lightbox, so I unpacked the folder into app/code/local, but it still doesn't show in the admin area.
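
    A minimal sketch of one common fix, assuming the Magento root is /var/www/magento and the web server runs as the apache user (adjust both to your setup):

        # Give the web server user ownership of the Magento tree
        sudo chown -R apache:apache /var/www/magento
        # Directories need write+execute, files need write, for that user
        sudo find /var/www/magento -type d -exec chmod 775 {} +
        sudo find /var/www/magento -type f -exec chmod 664 {} +

    After that, press the downloader's "Refresh" button again.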

    Read the article

  • How to stop Thunar being default file browser

    - by Charles Kane
    On a relatively new 11.04 installation, Thunar has become the default file browser simply through my using it! Whilst I can open Nautilus easily enough, I'd rather it remained the default, especially when I choose to view files in dual pane. The only action I can pinpoint that might have handed my files and folders over to Thunar is turning Nautilus into Nautilus-Elementary (oh, and Unity carked it, so I reverted rather unwillingly to Classic; glad I did, as it is so much more stable, and this is my production machine, where Unity acts as if it is in early alpha as far as I can tell!)
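
    One way to hand directory handling back to Nautilus is to reset the default handler for the inode/directory MIME type; a sketch (the .desktop file name may differ on your install):

        # See which application currently claims directories
        xdg-mime query default inode/directory
        # Point it back at Nautilus
        xdg-mime default nautilus.desktop inode/directory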

    Read the article

  • Cleaning up a folder structure from Visual Studio artifacts from the shell

    A few years ago I wrote a post that showed how to write a NAnt script to clean a folder structure of the artifact folders used by Visual Studio. Today I want to show you a way that doesn't require NAnt to be installed on your computer, but instead uses just a very simple command in the Windows shell. Actually, it's just a very tiny variation of the same command that Jon Galloway wrote to clean a folder structure of SVN files. But without further ado, here it is: Windows Registry Editor...
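
    (The excerpt truncates the actual registry file. As a rough, hypothetical sketch of the same cleanup typed straight at a cmd prompt, rather than wired into the registry, something like this removes the usual bin and obj artifact folders:)

        :: Interactive-prompt form; in a .bat file, double the % signs
        for /d /r . %d in (bin,obj) do @if exist "%d" rd /s /q "%d"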

    Read the article

  • Ubuntu One using 500 MB of memory even when idle

    - by cdysthe
    I'm a Dropbox convert (I hope!), but after having used Ubuntu One for a couple of weeks I notice a few differences from Dropbox. The most glaring difference is that the sync daemon constantly takes 500 MB of RAM on my system (Ubuntu 12.04 x64). It hogs this amount of memory as soon as I log in, does its initial sync/check, but keeps the memory. All in all, it seems to me that Ubuntu One uses more system resources than Dropbox. I am syncing the same folders and files with Ubuntu One as I was with Dropbox. Also, after I log in, Ubuntu One grinds at 100% CPU for at least five minutes, which can be annoying on a laptop but is not a showstopper. I'm wondering if this is a problem on my system, or if Ubuntu One is expected to use that amount of memory even when idle?
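
    To confirm that it is the sync daemon itself holding the memory (and not a helper process), something like this is a quick check; ubuntuone-syncdaemon is the usual process name, but verify yours:

        # The RSS column is resident memory in kilobytes
        ps -o pid,rss,etime,comm -C ubuntuone-syncdaemon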

    Read the article

  • How to get pngcrush to overwrite original files?

    - by DisgruntledGoat
    I've read through man pngcrush and it seems that there is no way to crush a PNG file and save it over the original. I want to compress several folders' worth of PNGs, so it would be useful to do it all with one command! Currently I am doing pngcrush -q -d . *.png and then manually cut-pasting the files from the tmp directory to the original folder. So I guess using mv might be the best way to go? Any better ideas?
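
    Newer pngcrush builds have an -ow (overwrite in place) flag; if yours predates it, a temp-file loop over the whole tree does the same job. A sketch:

        # Crush every PNG under the current directory in place
        find . -name '*.png' | while read -r f; do
            pngcrush -q "$f" "$f.tmp" && mv "$f.tmp" "$f"
        done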

    Read the article

  • Switching CSS to use asset pipeline in Rails?

    - by John
    I have a lot of legacy CSS files from what was a Rails 2.x app that got upgraded to Rails 3.2.8, and I want to switch over to using the Rails asset pipeline for stylesheets. The issue is that the CSS is messy: huge files, duplicate file names, and an unorganized folder structure. After looking through individual pages, trying to add individual stylesheets and folders into the asset pipeline, and spending some cycles debugging, I realized there's probably a better approach. Is there a way to test that the old CSS matches up with the asset pipeline CSS? What are some good tools for testing and debugging CSS?
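
    For the pipeline side, the usual first step is a Sprockets manifest that simply pulls everything in, so you can verify pages render identically before reorganizing anything. A sketch using the Rails default paths:

        /* app/assets/stylesheets/application.css */
        /*
         *= require_self
         *= require_tree .
         */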

    Read the article

  • Prevent anonymous access to form and application pages in SharePoint 2010

    - by shehan
    When you create a Publishing site that has anonymous access enabled, you will notice that anonymous users are not able to access pages that reside in the “_layouts” virtual directory (e.g. http://siteX/_layouts/viewlsts.aspx). This is because the publishing infrastructure activates a hidden feature that prevents anonymous users from accessing these types of pages. However, if you were to create a site collection based on the Blank Site template, you would notice that these pages are accessible to anonymous users. The fix is quite simple: there is a hidden feature that you need to manually activate via stsadm. The feature is called “ViewFormPagesLockDown” (and is available in the Features folder in the 14 hive). To activate it: stsadm -o activatefeature -filename ViewFormPagesLockDown\feature.xml -url http://ServerName Once activated, anonymous users will be prompted to enter credentials when they try to access form and application pages. The feature can also be deactivated for publishing sites that have it automatically turned on.
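
    For the deactivation case, the same stsadm verb pattern applies; a sketch mirroring the activate command above:

        stsadm -o deactivatefeature -filename ViewFormPagesLockDown\feature.xml -url http://ServerName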

    Read the article

  • Moving screenshots from one folder to another instantly

    - by Frank
    I am hosting a gaming site, and once in a while the server automatically creates a screenshot in the server's folder. Unfortunately it is NOT configurable in the server settings where it puts these screenshots; it just always dumps the PNG file in the server config folder. My folder structure is, for example, as follows: /home/Game/Server1/ Now, what I would like to achieve is that once the server creates a screenshot in one of these server folders (I have multiple), the operating system moves the screenshot IMMEDIATELY (it is ALWAYS a *.png file) to the webserver folder, for example: /var/www/Server1/filename.png So that players can see the screenshot on the website. Anyone any idea on how I can tackle this problem the smartest way? Please note that my ideal situation would be if the PNG file is moved immediately after creation. Thanks for your help. Frank
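
    On Linux the move-on-creation part maps naturally onto inotify; a sketch using inotifywait from the inotify-tools package, with the paths from the question:

        #!/bin/sh
        # Move each PNG out of the config folder as soon as it is fully written
        inotifywait -m -e close_write --format '%f' /home/Game/Server1 |
        while read -r name; do
            case "$name" in
                *.png) mv "/home/Game/Server1/$name" "/var/www/Server1/$name" ;;
            esac
        done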

    Read the article

  • How do I change the default .htm file icon?

    - by Michael Clayton
    I really enjoy the look of Ubuntu. The only thing I want to change is the default icon used for .html (.htm) files. I want to use the icon /usr/lib/firefox/browser/icons/mozicon128.png instead. I do not want to change any other visual element. Is there a practical way to accomplish this small change? Edit: @Mitch, I've used assogiate in the past, and although I was able to change the icon used for .mht files, I could not get it to change the .htm icon. @Anwar Shah, thanks for the information; I wish it would work for me. Running 13.10 x86: after I copy the icons, the folders contain a bunch of links to .svg files, not actual graphics files. The second copy does not appear to actually do anything on my system.
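
    One approach that often works with GNOME-based themes is dropping a correctly named MIME-type icon into the local icon theme; a sketch, assuming your theme falls back to hicolor for the text-html type (untested across themes):

        mkdir -p ~/.local/share/icons/hicolor/128x128/mimetypes
        cp /usr/lib/firefox/browser/icons/mozicon128.png \
           ~/.local/share/icons/hicolor/128x128/mimetypes/text-html.png
        gtk-update-icon-cache -f -t ~/.local/share/icons/hicolor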

    Read the article

  • How does Ubuntu One sync two machines with identical file content?

    - by user27449
    I have a notebook and a desktop computer, both running Ubuntu 11.10. I used to sync between the two with the help of Unison, so both computers have identical content in the Documents folder. I decided to try Ubuntu One. My question is: if I activate Ubuntu One on the two machines for the folders with identical contents, will Ubuntu One be able to recognise that, or will it sync everything to the cloud twice (and then down to the other machine)? To put it another way, will I end up having two copies of everything on the machines and in the cloud, and should I therefore delete the identical files on one of the machines before activating Ubuntu One, or not? Thank you, and if there is already something on the net about this, I'd be glad if somebody posted the link here.

    Read the article

  • Install Cirrus Logic cs46xx (audio card) drivers

    - by Aikanáro
    I have two sound cards: one is the on-board card (it's VIA), the other is a Cirrus Logic cs46xx. This is what lspci shows me: 04:04.0 Multimedia audio controller: Cirrus Logic CS 4614/22/24/30 [CrystalClear SoundFusion Audio Accelerator] (rev 01) It only shows the Cirrus Logic, because I disabled the VIA card through the BIOS. This page: http://es.driverscollection.com/?file_id=13152 gives me instructions to install it, but I can't follow them because the folders indicated on the page do not match the ones I see on my system. The ALSA page: http://alsa-project.org/main/index.php/Matrix:Module-cs46xx also gives me instructions, but I don't understand them. For example, they say to type ./configure in a terminal, but don't say in what directory. I don't think those are instructions for beginners... Right now I can't hear anything. I decided to disable the VIA audio card because I've read they don't get along with Linux, although I do use the integrated VIA video card. I have Ubuntu 11.10.
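
    For context on the ALSA instructions: ./configure is meant to be run inside the unpacked alsa-driver source directory. A sketch of the typical sequence (the version number is a placeholder, and build tools plus kernel headers must already be installed):

        tar xjf alsa-driver-1.0.x.tar.bz2
        cd alsa-driver-1.0.x
        ./configure --with-cards=cs46xx
        make
        sudo make install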

    Read the article

  • Best way to distribute graphics, audio and levels with an SDL game?

    - by Kristopher
    I'm working on finishing up a game written in C++ with SDL that I've been working on for a while, and I'm starting to ponder how I'm going to distribute it. It has hundreds of images that are loaded and used throughout the game, as well as a couple dozen .wav files for audio effects. What is the best way to distribute these? Should I just include the folders with all the files? Or is there a way I can package them into a single file, then open and extract them in my application? What's the best way to go about this?
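
    A common pattern is to pack the asset folders into a single archive at build time and read it back at runtime through an archive-mounting library such as PhysFS, which is often paired with SDL. The packing half is a one-liner (the folder names here are examples):

        # Bundle all game assets into one file shipped next to the binary
        zip -r assets.pak images/ sounds/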

    Read the article

  • Bitbucket and a small development house

    - by Marlon
    I am in the process of finally rolling out Mercurial as our version control system at work. This is a huge deal for everyone as, shockingly, they have never used a VCS. After months of putting the bug in management's ears, they finally saw the light and now realise how much better it is than working with a network of shared folders! In the process of rolling this out, I am thinking of different strategies to manage our stuff, and I am leaning towards using Bitbucket as our "central" repository. The projects in Bitbucket will be solely private projects, and everyone will push and pull from there. I am open to different suggestions, but has anyone got a similar setup? If so, what caveats have you encountered?

    Read the article

  • Using branches for a mini project or a module of a project: Good practice?

    - by TheLQ
    In my repo I have 3 closely related mini projects: 1 server and 2 clients. They are all quite small (<3 files each). Since they are so small and so closely related, I just dropped them into folders in one single repo. However, now that I know I can't clone a single directory in my VCS of choice (Mercurial), I'm considering splitting them up. But I'm confused about general best practice: is it okay to put different small projects in different branches, or should they all go in different repos? I'm currently leaning towards branching, since I can't easily splice out the file history of the different projects, but then you're using a feature in a way it wasn't meant to be used.
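
    On the "can't easily splice out the file history" point: Mercurial's convert extension can carve a subdirectory, history included, into its own repo via a filemap. A sketch, assuming the server code lives in a server/ subdirectory:

        # filemap.txt: keep only server/ and promote it to the repo root
        cat > filemap.txt <<'EOF'
        include server
        rename server .
        EOF
        hg convert --filemap filemap.txt big-repo server-repo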

    Read the article

  • How to Mount a Hard Drive as a Folder on Your Windows PC

    - by Taylor Gibb
    Getting a new drive is always exciting, but having 6 or 7 drives show up in My Computer isn't always ideal. Using this trick you can make your drives appear as folders on another drive. Logically it will look like it's one drive, but any files in that folder will physically be on another drive. Note: this will only work with NTFS formatted drives. Press the Windows Key and R to bring up a run box, type diskmgmt.msc and press Enter.
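
    The same mapping can also be done from a command prompt with mountvol instead of clicking through Disk Management; a sketch (the volume GUID is a placeholder; run mountvol alone to list the real ones):

        :: List volume GUIDs, then bind one to an empty folder on an NTFS drive
        mountvol
        mountvol C:\Mount\NewDrive \\?\Volume{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}\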

    Read the article

  • What are the files pushed to MDS?

    - by harsh.singla
    All files under AIAComponents are moved to MDS. This covers the EnterpriseObjectLibrary, EnterpriseBusinessServiceLibrary, ApplicationObjectLibrary, ApplicationBusinessServiceLibrary, B2BObjectLibrary, ExtensionServiceLibrary, and UtilityArtifacts. Some common transformation (.xsl) files, kept under the Transformations folder, are also moved to MDS. The AIAConfigurationProperties.xml file is in MDS. Every cross-reference (.xref) object is there, as is every Domain Value Map (.dvm). So is the common fault policy, which is included in a composite by default during composite generation if the user does not choose to customize the fault policy. All these files are located under the AIAMetaData directory and then placed in their respective folders. We are also planning to put error handling and BSR system related data into MDS.

    Read the article

  • Unable to sync files not located in the Ubuntu One folder

    - by pst007x
    I have the option to sync files in Nautilus; however, when the option is selected nothing happens. I am able to (slowly) sync files in the Ubuntu One folder, but only there, and not in its sub-folders. Why give the option if it does not work? This really is still beta, so charging for this service is terrible. I am a little frustrated at the moment; I have tried all sorts, from removing accounts to a fresh Ubuntu install! I am at a loss :-( Can anyone else sync files outside the Ubuntu One folder, and if so, how? Thanks
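
    For folders outside the Ubuntu One directory, the daemon's command-line tool is sometimes more reliable than the Nautilus option; a sketch using u1sdtool from the ubuntuone-client package:

        # Mark an arbitrary folder for syncing, then check the daemon
        u1sdtool --create-folder="$HOME/Documents"
        u1sdtool --status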

    Read the article

  • MP4 files show up but won't stream to Xbox 360

    - by Greg
    I set up a basic media server to stream to my 360 using uShare. Here are the instructions I used: http://linuxexpresso.wordpress.com/2011/01/02/howto-ubuntu-upnp-server-to-xbox-360/. I can stream AVI files fine, but I cannot stream MP4s. When I go to videos on the Xbox, I can see all of the videos and folders, but when I press play on an MP4 nothing happens. On my Ubuntu desktop I can click on the MP4 file and it plays fine. And if I take that file, stick it on a thumb drive and plug it directly into the Xbox, the MP4 will play off the thumb drive. I'm at a loss as to why it won't work through uShare. Any ideas?
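
    One thing worth checking is whether uShare is running in its Xbox 360 compatibility mode; the 360 is picky about how formats are advertised. A sketch of the relevant lines in /etc/ushare.conf (key names as shipped by the Ubuntu package, so verify against your copy):

        # /etc/ushare.conf
        USHARE_ENABLE_XBOX=yes
        # DLNA profiles can help clients negotiate MP4 (needs libdlna support)
        USHARE_ENABLE_DLNA=yes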

    Read the article

  • Failed update of Ubuntu 10.10 results in unbootable system

    - by chessweb
    Hi, yesterday I performed an automatic security update suggested by the update manager on my virtualized (with VirtualBox on a Windows 7 host) Ubuntu 10.10 installation. The update somehow failed and left me with an unbootable system. When I try to boot, I am told that various folders, files, and whatnot are missing. Then the system drops into a busybox and leaves me with an (initramfs) prompt. This happens with all kernels GRUB offers, although the error messages are quite different from kernel to kernel. Well, the short of it is this: I don't have the slightest idea how to get back to a working system, and this site is the final straw I'm willing to grab. A complete disaster like this, following an update initiated and executed by the system, is unheard of in Windows-land; at least I haven't heard of it yet, and therefore I am going to abandon Ubuntu and Linux altogether if there is no remedy. Regards, RSel
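
    From an (initramfs) prompt, the first thing worth trying is a filesystem check of the root partition, since a failed update plus a hard stop often just leaves the filesystem dirty. A sketch (/dev/sda1 is a guess; identify your root partition first):

        # At the (initramfs) prompt
        blkid                 # list partitions to find the root filesystem
        fsck -y /dev/sda1     # repair it, answering yes to all fixes
        reboot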

    Read the article

  • Best practices for managing deployment of code from dev to production servers?

    - by crosenblum
    I am hoping to find an easy tool or method that allows managing our code deployment. Here are the features I hope this solution has: either web-based or a batch file that, given a list of files, will communicate with our production server to back up those files from their different folders, zip them, and put them in a backup code folder. Then it records the name, date/time, and purpose of the deployment. Then it sends the files to their proper spots on the production server. I don't want too complex an interface for doing the deployments, because then they might never use it. Or is what I am asking for too unrealistic? I just know that my self-discipline isn't perfect, and I'd rather have a tool I can rely on to do what needs to be done than my own memory of what exact steps I have to take every time. How do you guys make sure everything gets deployed correctly, and have easy rollback in case of any mistakes?
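
    Even before choosing a tool, the backup-then-deploy steps fit in a small script; a sketch using rsync over SSH (the host alias prod, the paths, and the file list are all placeholders):

        #!/bin/sh
        # Back up each listed file on the production host, then push the new copy
        STAMP=$(date +%Y%m%d-%H%M%S)
        while read -r f; do
            ssh prod "mkdir -p '/backups/$STAMP' && cp -p --parents '/var/www/$f' '/backups/$STAMP'"
            rsync -av "$f" "prod:/var/www/$f"
        done < files-to-deploy.txt
        echo "$STAMP: deployed $(wc -l < files-to-deploy.txt) files" >> deploy.log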

    Read the article

  • Thunar not showing thumbnails in Openbox

    - by RanRag
    I am using Openbox 3.5 and my file manager of choice is Thunar, but the problem I am facing is that Thunar is not showing thumbnails for folders, video files, etc. My Thunar: You can see in the above image that Thunar is not showing thumbnails for folders and files, though it is showing them for PDFs; the case is similar for video files: no thumbnails. So, what is wrong here? Am I missing some basic Thunar config files? ranrag@ranrag:~$ echo $DESKTOP_SESSION openbox I looked into some forums and they mentioned that Thunar depends on tumbler for thumbnails, but I have the latest version of tumbler installed on my system. PS: I already tried Thunar's preferences, but still no luck.
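
    Under a bare Openbox session the usual culprit is a missing thumbnailer plugin rather than tumbler itself; a sketch of what to install on Ubuntu (package names can vary by release):

        sudo apt-get install tumbler tumbler-plugins-extra ffmpegthumbnailer
        # Restart the tumbler daemon and Thunar so the plugins are picked up
        pkill tumblerd; thunar -q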

    Read the article

  • Source Control and SQL Development – Part 3

    - by Ajarn Mark Caldwell
    In parts one and two of this series, I have been specifically focusing on the latest version of SQL Source Control by Red Gate Software.  But I have been doing source-controlled SQL development for years, long before this product was available, and well before Microsoft came out with Database Projects for Visual Studio.  “So, how does that work?” you may wonder.  Well, let me share some of the details of how we do it where I work…

    The key to this approach is that everything is done via Transact-SQL script files: either natively written T-SQL, or generated.  My preference is to write all my code by hand, which forces you to become better at your SQL syntax.  But if you really prefer to use the Management Studio GUI to make database changes, you can still do that, and then use the Generate Scripts feature of the GUI to produce T-SQL scripts afterwards, and store those in your source control system.  You can generate scripts for things like stored procedures and views by right-clicking on the database in the Object Explorer and choosing Tasks, Generate Scripts (see figure 1 to the left).  You can also do that for the CREATE scripts for tables, but that does not work when you have a table that is already in production and you need to make just a simple change, such as adding a new column or index.  In this case, you can use the GUI to make the table changes, and then instead of clicking the Save button, click the Generate Change Script button.  Then, once you have saved the change script, go ahead and execute it on your development database to actually make the change.  I believe that it is important to actually execute the script rather than just click the Save button, because this is your first test that your change script is working and that you didn’t somehow lose a portion of the change.

    As you can imagine, all this generating of scripts can get tedious and tempting to skip entirely, so again, I would encourage you to just get in the habit of writing your own Transact-SQL code, and then it is just a matter of remembering to save your work, just like you are in the habit of saving changes to a Word or Excel document before you exit the program.

    So, now that you have all of these script files, what do you do with them?  Well, we organize ours into folders labeled ChangeScripts, Functions, Views, and StoredProcedures, and those folders are loaded into our source control system.  ChangeScripts contains all of the table and index changes, and anything else that is basically a one-time-only execution.  Of course you want to write your scripts with qualifying logic so that if a script were accidentally run more than once in a database, it would not crash nor corrupt anything; but these scripts are really intended to be run only once in a database.

    Once you have your initial set of scripts loaded into source control, making a change, such as altering a stored procedure, becomes a simple matter of checking out your CREATE PROCEDURE* script, editing it in SSMS, saving the change, executing the script in order to effect the change in your database, and then checking the script back in to source control.  Of course, this is where the lack of source control integration within SSMS becomes an irritation, because it means that in addition to SSMS, I also have my source control client application running to do the check-out and check-in.  And when you have 800+ procedures like we do, it can be quite tedious to locate the procedure I want to change in source control, check it out, then locate the script file in my working folder, open it in SSMS, make the change, save it, and then go back to source control to check it in.  Granted, it is not nearly as burdensome as, say, losing your source code and having to rebuild it from memory, or losing the audit trail that good source control systems provide.  It is worth the effort, and this is how I have been doing development for the last several years.

    Remember that everything that SQL Server Management Studio does in modifying your database can also be done in plain Transact-SQL code, and this is what you are storing.  And now I have shown you how you can do it all without spending any extra money.  You already have source control, or can get free, open-source source control systems (almost seems like an oxymoron, doesn’t it), and of course Management Studio is free with your SQL Server database engine software.  So, whether you spend the money on tools to make it easier or not, you now have no excuse for not using source control with your SQL development.

    * In our current model, the scripts for stored procedures and similar database objects are written with an IF EXISTS…DROP… at the top, followed by the CREATE PROCEDURE… section, and that followed by a section that assigns permissions.  This allows me to run the same script regardless of whether the procedure previously existed in the database.  If the script were only an ALTER PROCEDURE, it would fail the first time the procedure was deployed to a database, unless you wrote other code to stub it in if it did not exist.  There are a few different ways you could organize your scripts for deployment, each with its own trade-offs, but I think it is absolutely critical that, whichever way you organize things, you ensure that the same script is run throughout the deployment cycle, and do not allow customizations to creep in between TEST and PROD.  If you do, then you have broken the integrity of your deployment process, because what you deployed to PROD was not exactly the same as what was tested in TEST, so you have effectively released untested code into PROD.

    Read the article

  • How do I export customized Libreoffice config files?

    - by carestad
    Is this possible? I want to make my own config file for my customizations that I can apply whenever I reinstall my system. For example, Ubuntu's default font color is just stupid: I want it to be BLACK, not dark grey. And I want to turn on autosave every 3 minutes, and backup files. Is there a config file that I can change? The .libreoffice/* folders and XML files don't make sense to me, and they don't seem to change when I change stuff in LibreOffice. Could someone please help me out with this? Thanks.
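
    LibreOffice keeps virtually all user settings in a single file, registrymodifications.xcu, and only writes it out when LibreOffice exits; that is likely why the XML never seemed to change. A sketch of backing it up and restoring it (the path varies: ~/.libreoffice/3/user on older releases, ~/.config/libreoffice/4/user on newer ones):

        # Close LibreOffice first so settings are flushed, then back them up
        cp ~/.libreoffice/3/user/registrymodifications.xcu ~/lo-settings-backup.xcu
        # After reinstalling, restore the file to the same location
        cp ~/lo-settings-backup.xcu ~/.libreoffice/3/user/registrymodifications.xcu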

    Read the article

  • Filtering content from response body HTML (mod_security or other WAFs)

    - by Bingo Star
    We have Apache on Linux with mod_security as the Web App Firewall (WAF) layer. To prevent content injections, we have some rules that basically disable any page containing certain text patterns from showing up at all. For example, if an HTML page on the webserver contains slur words (because some webmaster may have copied/pasted text without proofreading), the Apache server throws a 406 error. Our requirement now is a little different: we would like to serve the page as a regular 200, but if such a pattern is matched, we want to strip out the offending content, not block the entire page. If we had a server-side technology we could easily code for this, but sadly this is a website with 1000s of static HTML pages. Another solution might have been a cronjob to find/replace the strings en masse across folders, but we don't have access to the file system in this case (different department). We do have control over the WAF and Apache rules, if any. Any pointers or creative ideas?
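
    mod_security is built to block rather than rewrite response bodies, but Apache's own mod_substitute can do the strip-in-place part at the output-filter stage. A sketch (the pattern is a placeholder; enable the module with a2enmod substitute first):

        <Location "/">
            AddOutputFilterByType SUBSTITUTE text/html
            # n = treat the pattern as plain text, i = case-insensitive
            Substitute "s/offending phrase/[removed]/ni"
        </Location>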

    Read the article

  • Geotargeted subfolder questions (Portugal/Brazil and Switzerland)

    - by Lucy
    We are at the beginning of the process of getting multilingual versions of a website. We will be using subfolders working off the core domain (e.g. mydomain.com/fr/), setting the geotargeting in Webmaster Tools, and setting the hreflang attribute. I would really appreciate your help with a couple of questions. 1/ Portuguese: we will have a Portuguese-language version of the site. Our intention is to use this to cover users in both Portugal AND Brazil, i.e. we are not going to create separate folders mydomain.com/pt/ and mydomain.com/br/. Can I use 2 hreflang attributes for this language version to tell Google it covers Brazil AND Portugal? What country code should I use for this subfolder? 2/ Switzerland: does anyone have best-practice advice on how to do this? On the one hand, the subfolder should be mydomain.com/ch/, but as Switzerland covers 2 language possibilities (French AND German), what to do? Thanks
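
    On the Portuguese point, hreflang values can carry a region, so a single folder can be annotated for both markets; for Switzerland, the usual pattern is per-language folders annotated with CH regions rather than a /ch/ folder. A sketch of the link tags, using the question's placeholder domain:

        <link rel="alternate" hreflang="pt-PT" href="http://mydomain.com/pt/" />
        <link rel="alternate" hreflang="pt-BR" href="http://mydomain.com/pt/" />
        <link rel="alternate" hreflang="fr-CH" href="http://mydomain.com/fr/" />
        <link rel="alternate" hreflang="de-CH" href="http://mydomain.com/de/" />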

    Read the article
