Search Results

Search found 6589 results on 264 pages for 'photo upload'.

Page 189 of 264

  • How long does it take to delete a Google Apps for Domain account and allow recreation?

    - by Wil
    Basically, I set up a Google Apps for Domain account for one of my clients, but when I tried to upload a CSV of users, only half were created and then I kept getting timeouts, 404s and other errors I never saw when setting up another client. I wasn't sure how, but I thought I might have caused an error, or that there was an error on Google's side when the account was created, so I figured it would be best to delete the domain and start from scratch. Little did I know you have to wait before recreating the domain! As far as I can tell from what I have read, there is a five-day wait before similar usernames can be recreated, but I can't find anything that says how long you have to wait for the domain account itself. It has now been five or six days and I still can't recreate it. The cancellation email I got says "...If at any time you would like to sign-up for Google Apps for this or another domain, you can do so by visiting..." but it does not mention how long! I tried emailing Google directly but got no response. Does anyone know?

    Read the article

  • Wirelessly Sync Photos From iPhone To Computer Using CameraSync

    - by Gopinath
    How do you upload photos captured on your iOS device to your computer? By connecting the device with a cable and then syncing with an app? Isn't that a tedious way to do it? Here comes CameraSync, an app that wirelessly sends your iOS device's photos to Dropbox so that you can access them on your computer regardless of platform (Windows, Mac, Linux). The app works in the background and syncs the files without disturbing you. Don't like Dropbox? CameraSync works with a variety of cloud services: Flickr, Amazon S3, iDisk, FTP and Box.net. If you are looking for a step-by-step guide on setting up CameraSync with Dropbox, check this post. CameraSync costs $1.99 and runs on iOS 4.0+ devices. CameraSync [iTunes App via Lifehacker] This article was originally published at Tech Dreams.

    Read the article

  • What is the "default" software license?

    - by Tesserex
    If I release some code and binaries but don't include any license at all, what legal terms apply by default (in the US, where I am)? I know I automatically hold the copyright without doing anything, but what restrictions come with that? If I upload my code to GitHub and announce it as a free download / contribute at will, are people then allowed to modify and close-source my work? I haven't said that they cannot, as the GPL would, but I don't feel it would be acceptable by default to steal my work either. So what can and can't people do with code that is freely available but has absolutely no licensing terms attached? By the way, I know it would be a good idea for me to pick a license and apply it to my code soon, but I'm still curious about this. Edit: Thanks! It looks like the consensus is that the work starts out very restricted and my actions then imply any further rights. If I just put software on my website with no security, downloading it would be an infringement. If I post a link to that download on a forum, that implicitly gives permission to use it for free, but not to distribute it or its derivatives (though you can modify it for your own use). If I put it on GitHub, it is conveyed as FOSS. Again, this is probably not codified exactly in law but may be enough to be defensible in court. It's still a good idea to post a complete license to be safe.

    Read the article

  • Need help with ColdFusion and ASP.NET site [closed]

    - by Michael Stone
    To begin, I wasn't too sure how to title this... I've got a few questions. First off, I've got a very big site in ColdFusion that we've been migrating to ASP.NET C# 4.0 for the last 8 months. I've got a team of 7 programmers and no one can figure out the answers, not even our senior C# programmer. We're using Team Foundation Server and we can't figure out how to push up one small change at a time; right now we're stuck publishing the entire site, and it's causing serious issues. We currently have the site as a Project and not a Website, and we're wondering if that's one issue; I actually think it might be part of the problem. We're also dealing with an issue where we can't access our regular folders with relative paths. We're developing our admin side in .NET first, so we've got our regular site and then another site within it for our .NET admin tools. By site, I'm referring to them actually being Sites in IIS. This also creates a problem when we're building tools that upload images and want to store them and access them from our parent Site. I'd very much appreciate any advice on how to go about this in the most standardized way. What I'm hoping for is advice on: publishing and managing a site/project in Team Foundation Server (being able to push up one fix at a time would be great), and help figuring out the issue of referencing folders from my .NET child site to my parent ColdFusion site using regular relative paths ("/a/images/b/" would be nice instead of only being able to do "/b/images/"). We're using ColdFusion 8, C# ASP.NET 4.0 / Entity Framework / POCO Templates, and a Windows 2008 R2 server. Thank you in advance for any help.

    Read the article

  • Oracle Loader for Hadoop 1.1.0.0.3

    - by mannamal
    We are pleased to announce the availability of Oracle Loader for Hadoop 1.1.0.0.3, containing bug fixes and performance improvements to Oracle Loader for Hadoop. The updated product can be downloaded from here: http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/big-data-downloads-1451048.html Note that the Oracle Loader for Hadoop 1.1.0.0.3 kit is a complete kit containing the product and bug fixes; fixes from the earlier 1.1 patch releases are also included. Upgrading to Oracle Loader for Hadoop 1.1.0.0.3 (from versions 1.1.x):
    On the Oracle Big Data Appliance:
    1. Upload the new oraloader rpm to the first Oracle Big Data Appliance server. For example: /tmp/oraloader-1.1.0.0.3-1.x86_64.rpm
    2. As the root user, use dcli from the first Oracle Big Data Appliance server to copy the new rpm to all nodes. For example: #dcli -f /tmp/oraloader-1.1.0.0.3-1.x86_64.rpm -d /tmp/oraloader-replace.rpm
    3. As the root user, use dcli from the first Oracle Big Data Appliance server to replace the old oraloader rpm with the new one. For example: #dcli "rpm -e oraloader ; rpm -Uvh /tmp/oraloader-replace.rpm"
    On other hardware:
    1. Unzip oraloader-1.1.0.0.3.x86_64.zip at <location of install>
    2. Update OLH_HOME to point to <location of install>/oraloader-1.1.0.0.3

    Read the article

  • Ubuntu One has high CPU usage but no syncing

    - by Peter
    Over the weekend I updated my computer to Windows 8. Until then Ubuntu One was running smoothly, but ever since the update (a clean, new install) Ubuntu One doesn't sync any more. In Windows 7 it would start syncing at full internet speed as soon as I dropped a file. Now in Windows 8, as soon as I drop a new file into the Ubuntu One folder, CPU usage goes up to about 50% and no network traffic occurs. This goes on for a couple of minutes, then CPU usage returns to normal and the client says that everything is in sync, which isn't true. Is it too early for Windows 8? Do others have the same problem, or is there something I can do about it? I tried a couple of different things and realized that files around 20 MB do get uploaded; the original file was 1.5 GB. It also didn't work with 200, 100 and 50 MB files. But even with 20 MB files the upload is very slow and not steady. The log gives plenty of this error: - twisted - ERROR - Failure: exceptions.TypeError, which I don't know the meaning of. By the way, the account works just fine on the Ubuntu 12.04 partition. Any help is greatly appreciated. -Peter

    Read the article

  • Restrictive routing best practices for Google App Engine with python?

    - by Aleksandr Makov
    Say I have a simple structure: app = webapp2.WSGIApplication([ (r'/', 'pages.login'), (r'/profile', 'pages.profile'), (r'/dashboard', 'pages.dash'), ], debug=True) Basically all pages require authentication except for the login. If a visitor tries to reach a restricted page and isn't authorized (or lacks privileges), he gets redirected to the login view. The question is about the routing design: should I check the auth and ACL privileges in each of the modules (pages.profile and pages.dash in the example above), or just pass all requests through a single routing mechanism: app = webapp2.WSGIApplication([ (r'/', 'pages.login'), (r'/.+', 'router') ], debug=True) I'm still quite new to GAE, but my app requires authentication as well as ACL. I'm aware that there's a login directive at the server config level, but I don't know how it works or how I can tie it in with my ACL logic, and what's worse, I can't estimate the time needed to get it running. Besides, it appears to provide only two user groups: admin and user. In any case, this is the configuration I use: handlers: - url: /favicon.ico static_files: static/favicon.ico upload: static/favicon.ico - url: /static/* static_dir: static - url: .* script: main.app secure: always Or am I missing something here and ACL can be set in the config file? Thanks.
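
    One way to avoid repeating the auth/ACL check in every module is a shared base handler. This is a minimal sketch, assuming the App Engine Users API for authentication; the Dashboard handler and the is_allowed() ACL hook are hypothetical stand-ins, not code from the question:

      import webapp2
      from google.appengine.api import users

      class BaseHandler(webapp2.RequestHandler):
          # Every protected page inherits this dispatch(), so the auth/ACL check lives in one place.
          def dispatch(self):
              user = users.get_current_user()
              if not user:
                  return self.redirect('/')      # anonymous visitors go back to the login view
              if not self.is_allowed(user):
                  return self.abort(403)         # authenticated but lacking privileges
              super(BaseHandler, self).dispatch()

          def is_allowed(self, user):
              return True                        # hypothetical ACL hook; override per handler

      class Dashboard(BaseHandler):
          def get(self):
              self.response.write('dashboard')

      app = webapp2.WSGIApplication([(r'/dashboard', Dashboard)], debug=True)

    Whether the check sits in a base handler like this or in a single catch-all router is largely taste; either way the ACL decision runs before any page code.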

    Read the article

  • Joomla 1.6 site cannot add a new extension through admin interface

    - by Ghlouw
    I'm having a very frustrating problem with my Joomla 1.6 site: I cannot add any new extensions through the admin interface. I have tried uploading the extension, using the search-folder option, and even the direct link. None of these options work; all that happens is that the page tries to load forever until it finally times out with a blank white page (no further error messages). I have tried this with multiple browsers (Chrome, FF, IE) and with different extensions (modules, components, templates), all with the same result. So I don't think it has anything to do with what I am uploading; more likely the problem is something with the POST action. I have also seen the exact same error when I try to update menu items or even create new menu items. I am not getting this error with a duplicate of the site in the dev environment, only on my shared web hosting live server. This is a Windows IIS / PHP / MySQL environment. Any help would be much appreciated!

    Read the article

  • Chrome refused to execute this JavaScript file

    - by TestSubject528491
    In the head of my HTML page, I have: <script src="https://raw.github.com/cloudhead/less.js/master/dist/less-1.3.3.js"></script> When I load the page in my browser (Google Chrome v 27.0.1453.116) and enable the developer tools, it says: Refused to execute script from 'https://raw.github.com/cloudhead/less.js/master/dist/less-1.3.3.js' because its MIME type ('text/plain') is not executable, and strict MIME type checking is enabled. Indeed, the script won't run. Why does Chrome think this is a plain text file? It clearly has a .js file extension. Since I'm using HTML5, I omitted the type attribute, so I thought that might be causing the problem. So I added type="text/javascript" to the <script> tag, and got the same result. I even tried type="application/javascript" and still got the same error. Then I tried changing it to type="text/plain" just out of curiosity. The browser did not return an error, but of course the JavaScript did not run either. Finally I thought the periods in the filename might be throwing the browser off. So in my HTML code, I changed all the periods to the URL escape character %2E: <script src="https://raw.github.com/cloudhead/less%2Ejs/master/dist/less-1%2E3%2E3.js"></script> This still did not work. The only thing that truly works (i.e. the browser does not give an error and the JS successfully runs) is if I download the file, upload it to a local directory, and then change the src value to the local file. I'd rather not do this since I'm trying to save space on my own website. How do I get Chrome to recognize that the linked file is actually a JavaScript type?
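
    The extension never enters into it: Chrome acts on the Content-Type header the server sends. A minimal sketch for confirming that header yourself, using only the Python standard library (the URL is the one from the question and may no longer resolve):

      from urllib.request import Request, urlopen

      url = 'https://raw.github.com/cloudhead/less.js/master/dist/less-1.3.3.js'
      response = urlopen(Request(url, method='HEAD'))   # fetch headers only, no body
      print(response.headers.get('Content-Type'))       # the question reports text/plain, despite the .js extension

    Serving the file from a host that sends a JavaScript MIME type (or from your own server) is what ultimately satisfies the strict MIME check.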

    Read the article

  • How do I get the old desktop view back? [closed]

    - by xan
    Possible Duplicate: How to revert to GNOME Classic? Is it possible to change the overall look in Ubuntu 12.04 so that it is like Ubuntu 10.04? I really liked the old one and really don't like the new one. Things that annoy me: the menu on the left with large icons; the lack of workspace icons in the bottom right corner; the lack of the Applications, Places, System menus in the top left corner; and menus like File | Edit | View | Help plus the resize and close buttons appearing at the very top of the desktop for all applications instead of at the top of the application window. I know it may sound crazy, but that's just me; I don't like changes, even though I know they are often good. I would be very grateful for any help. I really want the old look (exactly like here: http://upload.wikimedia.org/wikipedia/commons/2/2d/UbuntuMaverickDesktop.png) back. By the way, if I installed Ubuntu 12.04 via Wubi from Windows on a separate partition, what is the easiest way to uninstall it? Can I simply format that partition? I'm wondering because I don't know if it is safe, and what about the boot loader, etc.

    Read the article

  • Automatically generated code: a "derived work"?

    - by Peregring-lk
    For example, say I have GPL software and I'm its author. This GPL software has Doxygen comments in its code, and those comments are written to generate an HTML documentation page that I want to publish on my project website under the CC-BY-SA license. But is the Doxygen documentation output a "derivative work"? After all, this documentation is based on my GPL source code. In that case the documentation must be GPL, but I want the documentation to be CC-BY-SA, because it is documentation. The GFDL doesn't help: GPL code can't become GFDL (the opposite is possible). If this output really is a derivative work, I think it creates a strange situation, because if I distribute my work, the recipients can't legally distribute the generated documentation: while I can do what I want with my own work, the users can't, since they have to distribute any derivative work under the same license I offered them. What is the solution?

    Read the article

  • Web development tips for building a website and the advantages of using HTML5

    - by Siddarth
    I am trying to make a website for my college. The program starts on Jan 13 and we get 15 days to develop a running site; the best site will become the college site. I am participating. Until now I have mostly taken part in C and C++ contests (and won a few), and for the last 2 months I have really gotten into web development. I knew HTML long ago and recently brushed up on it, learnt JavaScript from "JavaScript and jQuery: The Missing Manual" (sorry for not adding the link), and recently bought "PHP and MySQL Web Development"; I am getting on fine with it, but there are still a lot of pages to cover. After this, what else do I need to know? Is Ajax one thing to concentrate on? What else do I need to get this project up and running? Can someone let me know the tricks of the trade and what it takes to build a site like this? Right now I am good with JavaScript, HTML and CSS and that's it, and I am studying HTML5 and CSS3, which are pretty fast and neat. The site is a college website that includes student profiles, where students register their info with their college ID number, and that's pretty much it. Think of it as a college site plus a social networking site for students, where they can upload their pics, videos, PDF books, etc.

    Read the article

  • What are some efficient ways to set up my environment when working on a remote site?

    - by Prefix
    Hello fellow programmers, I am still a relatively new programmer and have recently gotten my first on-campus programming position. I am the sole dev responsible for 8 domains as well as 3 small PHP web apps. The campus has its web environment divided into staging and live servers: we develop on staging via SFTP and then push updates to the live server through a web GUI. I use Sublime Text 2 and the Sublime SFTP plugin for all my dev work (it's my preferred editor). If I am just making an edit to a page, I'll open that individual file via the FTP browser. If I am working on the PHP web app projects, I have the app directory mapped to a local folder so that when I save locally the file is auto-uploaded through Sublime SFTP. I feel like this workflow is slow and sub-optimal. How can I improve my workflow for working with remote content? I'd love to set up a local environment on my machine, as that would eliminate the constant SFTP upload/download, but as I said there are many sites, and the space required for a local copy of the entire domain would be quite large and complex; not to mention that keeping it updated with whatever is latest on the staging server would be a nightmare. Does anyone know how I can improve my general web dev workflow from what I've described? I'd really like to cut out constantly editing over FTP, but I'm not sure where to start other than ripping the entire directory and dumping it into XAMPP.

    Read the article

  • Is WordPress more appropriate than Magento/OpenCart for a site like this?

    - by Alex
    The premise of the site is that a user pays a small fee to advertise an item they want to sell, so the user, not the administrator, is responsible for adding the "products". The product upload will create a product page for that item. This is a rather common framework that I'm sure you're familiar with. My initial thought was that it would be best suited to Magento, mainly because it needs to accept payments, and the products will grow into a catalog of categorized products. However, there is no concept of a shopping cart: a buyer does not buy the item online or go to a checkout; they simply look at the product and contact the seller if they like it, and the buyer and seller take it from there. For this reason I then began to suspect that Magento is perhaps overkill, or simply not the right CMS when there is no checkout procedure (other than the uploader making a payment). So then I began to think WordPress... Feature requirements: users can add content via a form process; users can be directed to a payment gateway; each product listing displays a series of photographs in thumbnail form; zoom and rotate capabilities on the images would be a welcome feature. In short: e-commerce CMS, or something more simple?

    Read the article

  • How do I remove a malformed line from my sources.list?

    - by eminencejae
    I have uninstalled and reinstalled the Ubuntu Software Center as per info I found in a similar thread, and I got the same response about line 91 or something like that. I just tried to upload a screenshot, but since I'm new it won't allow me to. I also can't figure out how to cut and paste anything, so I have to hand-type what the error screen says. Both when I attempt to open the Software Center (and nothing happens) and when I try to enter commands into the terminal to uninstall, reinstall, or whatever, I get the same message: COULD NOT INITIALIZE THE PACKAGE INFORMATION. An unresolvable problem occurred while initializing the package information. Please report this bug against the 'update-manager' package and include the following error message: 'E: Malformed line 91 in source list /etc/apt/sources.list (dist parse) E: The list of sources could not be read. E: The package list or status file could not be parsed or opened.' How do I report bugs? What can be done about this? I have searched, and everything everyone says to do leads me back to the same line error message. So I don't know how to get to line 91 in the sources list to tell you what it says. Sorry, I'm really new to this. What I need is to find out how to get there and fix what it says. I would really like NOT to have to repartition my hard drive and start from scratch, so I'm really looking forward to getting this problem solved. I need to be able to install new software.
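
    Getting at line 91 only takes reading /etc/apt/sources.list. A minimal sketch in plain Python, assuming read access to the file (editing it afterwards still needs root, e.g. via sudoedit):

      # Print the line apt complains about, plus a little context around it.
      with open('/etc/apt/sources.list') as sources:
          for number, line in enumerate(sources, start=1):
              if 89 <= number <= 93:
                  marker = '>>' if number == 91 else '  '
                  print(marker, number, line.rstrip())

    Once the malformed entry is visible, commenting it out or correcting it and then running sudo apt-get update is the usual way forward.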

    Read the article

  • List all BPM Processes for a user

    - by kasriniv
    Hello, happy to start contributing to this blog. The title is probably deceptively simple and warrants some elaboration. Customized BPM workspaces/user interfaces are a fairly common requirement. One of our marquee customers in the online stock trading business envisioned this user interaction for their BPM application: (1) the user logs in to the internal portal; (2) the user sees the roles he has been granted as a drop-down list; (3) once the user selects a role, a list of the processes the user is part of appears, where the logged-in user can be part of any swimlane role of the process. This is a fairly common and reasonable user-UI interaction pattern. Items 1 and 2 are easily achievable, so the subject of this post is requirement 3. Objective: given a username and a role, list all the BPM processes that the user is part of, in any swimlane of any process. Here is a quick overview of the major steps/logic in the code: initialize the workflow/BPM context as usual; get a handle on InstanceQueryService (getInstanceQueryService), InstanceManagementService, ProcessMetadataService and ProcessModelService; list all processes for that BPM context (listProcessMetadataSumary) and get the roles granted to that user; then, for each of the processes [method getAccessibleProcesss(ProcessMetadataSummary, Set)], for each of the lanes in the process, check whether a role granted to the user matches the roleName for that swimlane, and if so add it to the output (the loop is sketched below). Notes: the usual caveats apply, including that the BPM APIs are subject to change. JDeveloper method introspection is a better friend than the API documentation :-)... (I am going to try to upload the source code, and if that doesn't work I will follow this post up with it.) Hope this helps. Ack: Yogesh K, BPM Dev team.
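
    For readers who want the lane-matching loop spelled out, here it is sketched with plain Python dictionaries rather than the actual BPM Java services; the data shapes and names below are hypothetical stand-ins, not the real API:

      def accessible_processes(processes, granted_roles):
          # Keep a process if any of its swimlane roles is among the user's granted roles.
          matching = []
          for process in processes:
              if any(lane['role'] in granted_roles for lane in process['lanes']):
                  matching.append(process['name'])
          return matching

      example = [
          {'name': 'LoanApproval', 'lanes': [{'role': 'Approver'}, {'role': 'Requester'}]},
          {'name': 'AccountOpening', 'lanes': [{'role': 'BackOffice'}]},
      ]
      print(accessible_processes(example, {'Approver'}))   # ['LoanApproval']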

    Read the article

  • YouTube custom thumbnails feature availability

    - by skat
    I've been trying to figure this out on my own for weeks, but now I give up. The "custom thumbnails" feature on YouTube is such a controversial one; it has been changed so much that even the YouTube FAQ doesn't fully describe it (as far as I can see). I have a YouTube channel for one of my websites. This channel is the main marketing force for my website: it brings all the boys to my yard (I mean, website). So I have to use every trick I can to increase my visibility on YouTube, and damn, those custom thumbnails are giving me a hard time... As far as I understand, this is the current state of the custom thumbnail feature: "If your account is in good standing, you may have the ability to upload custom thumbnails for your video uploads." (c) https://support.google.com/youtube/answer/138008 My channel is in good standing and has more than 50,000 views. So why the hell is my account still not eligible for this feature? Anyone have any idea?

    Read the article

  • Are file permissions preserved when a file is transferred from Ubuntu to Windows?

    - by Gaurav_Java
    I have a 9 GB encrypted text file containing some confidential data. It lives on my system (Ubuntu) and on my external HDD (NTFS). The file gets updated and re-encrypted daily, and it has to be shared among 2-3 people on Windows. I set permissions so that no other user can even read the file (chmod 660). It is too large to upload anywhere and it gets updated daily, so the file travels between Windows and Ubuntu, and I also keep a copy on my personal computer. Recently it was deleted by another user on Windows. I just want to know how I can set permissions on that file so that it cannot be deleted from any other operating system. If someone deletes it, all I have left is a copy that is a couple of days old, which is only on my system. I went through this question and it says there is nothing, and from this question I am not able to understand how I can protect it. Can I do anything to prevent this file from being deleted? Any suggestion, software or idea for protecting it from deletion would help. Maybe I sound silly or this is a stupid question; please don't close it. Thanks for any suggestion or solution.

    Read the article

  • Is there a checklist of things a small website should have?

    - by Mecon
    I am not a web developer; this would be my first foray. I can do HTML/CSS/JavaScript, but I have never created a website for a company. If somebody is creating a site for a small company (expecting some 10-15 static pages), what kinds of things would it need to have? I am thinking of the following: have the eventual owner buy, in his/her name, the domain name, web hosting package and email package (Q - do web developers generally ask their clients to buy this stuff and then ask them to share their passwords? OR do web developers ship the source files to clients so that they can upload them?); create cross-browser-compatible HTML+CSS+JavaScript pages; add SEO stuff like meta tags and an XML sitemap for the crawler; buy professional images from a stock website (Q - is there a best practice for this step?); add copyright stuff (Q - any idea how to do this?); add Facebook widgets so people can 'like' my website; register the website somewhere so that it is searchable from multiple search engines / yellow pages (Q - does such a thing exist?). Please check my checklist :) and let me know what you think could be missing. Thanks!

    Read the article

  • Which is the best image hosting site for hosting images for website? [closed]

    - by rahul dagli
    I currently have a website and a blog on a limited web hosting plan. When I upload images to my hosting server they consume a lot of bandwidth and space, so I was thinking of hosting the images on some other image hosting site and direct-linking them from my site. I found a few sites like ImageShack, Photobucket, TinyPic and Imgur; however, they all seem to have certain restrictions. The features I am looking for are as follows:
    1. At least 10 GB of space
    2. At least 500 GB of bandwidth (because I have very high traffic)
    3. Very high speed even under heavy load, like 1000 visitors accessing every hour
    4. Ultra-reliable servers (99.9% uptime)
    5. Privacy control
    6. Must never delete an image for inactivity
    7. Create and manage albums
    8. A company that will stay in business for at least the next 10 years
    9. Free of cost
    10. Hotlinking/direct linking of images

    Read the article

  • Amazon Careers website - are resumes processed in plain text format only?

    - by sapphiremirage
    The submission site has the following options: "Please upload your resume (Word Document, max size: 512 KB)" OR "Please copy and paste the text version of your file here", with a text box below the latter option. I went ahead and uploaded my shiny LaTeX resume (as a PDF), despite the fact that they seem to want a Word document, and there didn't seem to be any issues. However, when I went back to edit my profile, there was no evidence that my PDF had been uploaded, other than a text version of my resume, awfully formatted and clearly stripped from the PDF, sitting in the text box below "Please copy and paste the text version of your file here". Exasperated, I did a quick and dirty copy of the text from my resume into a Word doc and uploaded that. Same result: no evidence of an uploaded file, just a stripped text version in the text box. What I'm wondering now is: are they only going to look at the text version of my resume? If that's the case then I'm obviously going to edit it so that it looks halfway decent and doesn't contain such atrocities from the conversion as "Other Skills: LTEX". I can make plain-text files look presentable without too much effort, so this isn't that big a deal. However, my LaTeX resume is going to look better than anything I can do in plain text, so if the site is actually keeping a copy of that, then I certainly don't want to override it. Has anyone here either gone through the Amazon hiring process or interviewed candidates and knows how this works? (i.e. when on site with Amazon, did the interviewers have diversely formatted resumes, or did they all look suspiciously similar?)

    Read the article

  • Need a Quick Sure Method to Produce a Formatted Explain Plan? This will help!

    - by user702295
    Please use the following on the production machine to get a formatted explain plan and SQL trace using the slow SQL (e.g. 'T_COMB_LIST.COMB_ID = 216') or any other value that takes longer:
    -- Open a new session in SQL*Plus
    -- Make sure you are using an updated PLAN_TABLE
    -- (This can be done by dropping it and recreating it by running: SQL> @?/rdbms/admin/utlxplan.sql)
    set lines 1000
    set pages 1000
    spool xplan_1.txt
    EXPLAIN PLAN FOR <<<<Replace this line with exactly the same query you used above. Force a hard parse by modifying the case of a character>>>>
    @?/rdbms/admin/utlxplp
    spool off
    EXIT
    -- Open a second session in SQL*Plus
    ALTER SESSION SET max_dump_file_size = unlimited;
    ALTER SESSION SET tracefile_identifier = '10046';
    ALTER SESSION SET statistics_level = ALL;
    ALTER SESSION SET events '10046 trace name context forever, level 12';
    <<<<Replace this line with exactly the same query you used above. Force a hard parse by modifying the case of a character>>>>
    select 'verify cursor closed' from dual;
    ALTER SYSTEM SET EVENTS '10046 trace name context off';
    EXIT
    Make sure the spooled file is formatted properly and that the 10046 trace has the relevant explain plan in it. Please upload both files (the 10046 trace is generated in udump). Need instructions to find udump?
    sqlplus "/ as sysdba"
    show parameters dump_dest
    This will show you the bdump, cdump and udump locations.

    Read the article

  • WebMatrix "The Site Has Stopped" Fix

    - by Tarun Arora
    I just got started with Azure Web Sites by creating a website from the WordPress template. Next I installed WebMatrix so that I could run the website locally. Every time I tried to run my website from WebMatrix I hit the message "The following site has stopped 'xxx'".
    Step 00 – Analysis: It took a bit of time to figure out that WebMatrix uses IIS Express, but it was easy to see that IIS Express was not showing up in the system tray when I started WebMatrix, which was a good indication that IIS Express was having trouble starting. So I opened a CMD prompt and tried to run IISExpress.exe, which resulted in an error message. Running IISExpress.exe /trace:Error gave a more detailed reason for the failure.
    Step 1 – Fixing "The following site has stopped 'xxx'": Further analysis revealed that the IIS Express config file had been corrupted. So I navigated to C:\Users\<UserName>\Documents\IISExpress\config and deleted the files applicationhost.config, aspnet.config and redirection.config (please take a backup of these files before deleting them). Back in CMD, I ran IISExpress /trace:Error; IIS Express started successfully and parked itself in the system tray. I opened WebMatrix and clicked Run, and this time the default site loaded in the browser without any failures.
    Step 2 – Download the WordPress Azure website using WebMatrix: Because the config files applicationhost.config, aspnet.config and redirection.config were deleted, I lost the settings of the Azure-based WordPress site that I had downloaded to run from WebMatrix. This was simple to sort out: open WebMatrix, go to the Remote tab and click Download; export the PublishSettings file from the Azure Management Portal and upload it in the pop-up you get after clicking Download in the previous step. Now you should have your Azure WordPress website all set up and running from WebMatrix. Enjoy!

    Read the article

  • Canon MP540 printer finally added successfully but doesn't print? (debug log attached)

    - by NES
    I tried to set up my printer in Ubuntu 10.10. I had to use a special guide to install it, because Canon uses libcupsys2 in its packages while Ubuntu expects libcupsys, so I followed this guide in reference to this advice. The first problem was that the Ubuntu "add printer" dialog asked for root credentials; the workaround suggested on Launchpad, creating a root password, worked, and I added the printer, so it is now available for printing. Then I got a "cups insecure filter" error message which prevented me from printing. That could be solved by setting the needed root rights on the /usr/lib/cups/filter/ directory, and the error message disappeared after restarting the cups service. Now it should work, but it doesn't. The main problem is that the printer seems to be properly set up, but when I try to print a document, the printer icon appears briefly in the GNOME panel, a print job shows up in the queue and gets marked completed, yet the printer doesn't print. I attached the debug log provided by the printer error control; I had to upload it to another site, since it was too big for the question body here. Perhaps someone can identify the problem with it? Note: I know it once worked fine with an older release of Ubuntu, but I'm not sure which version that was.

    Read the article

  • public_html permissions for local development

    - by maGz
    I know this question has popped up a couple of times, but I can't seem to find a definitive answer to my issue, so please bear with me. I have Ubuntu Server 12.04 set up in VirtualBox for PHP development and testing (Drupal plus other PHP sites using the Yii framework). My question is in three parts...
    1) If I create a public_html folder under /home/myuser, do I need to give ownership of that folder to the Apache www-data group? If so, are there any specific permissions I should be setting? 755? (By the way, I am following this guide to create the public_html directory and set up multiple virtual hosts, one per site I create and test.) I previously had all of my sites under /var/www, but ran into massive "permission denied" errors whenever I tried to sFTP to it, either through FileZilla or PhpStorm. This is what I had previously done:
    sudo chgrp www-data /var/www
    sudo chmod -R 775 /var/www
    sudo chmod -R g+s /var/www
    sudo usermod -G www-data [my_ftp_user]
    2) The second part of my question is this: if I create my PHP project and files in Windows through PhpStorm and then upload via sFTP, will permissions get affected?
    3) Once I am satisfied with my developed project, would it be advisable to move and test it under /var/www to see how it would fare in a production-ish environment?
    I would really appreciate the help and advice here. I'm learning more as I go along, but dealing with Linux files and permissions is a bit of a new ball game for me! Thank you
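
    Purely as illustration of the scheme in part 1, here is the same group-ownership/setgid idea expressed with Python's standard library; the path and the 2775 mode mirror the /var/www commands above and are assumptions, not a recommendation, and the script needs enough privileges to change ownership:

      import os
      import shutil

      public_html = '/home/myuser/public_html'          # assumed path from the question

      shutil.chown(public_html, group='www-data')       # hand group ownership to Apache's group
      os.chmod(public_html, 0o2775)                     # rwxrwxr-x plus setgid, like chmod 775 + g+s
      for root, dirs, files in os.walk(public_html):    # apply the same group to everything beneath it
          for name in dirs + files:
              shutil.chown(os.path.join(root, name), group='www-data')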

    Read the article
