Search Results

Search found 16797 results on 672 pages for 'directory traversal'.

Page 376 of 672

  • Using branches for a mini project or module of a project: Good practice?

    - by TheLQ
    In my repo I have 3 closely related mini projects: 1 server and 2 clients. They are all quite small (<3 files each). Since they are so small and so closely related, I just dropped them into folders in one single repo. However, now that I know I can't clone a single directory in my VCS of choice (Mercurial), I'm considering splitting them up. I'm confused about general best practice, though: is it okay to put different small projects in different branches, or should they all go in different repos? I'm currently leaning towards branching, since I can't easily splice out the file history of the different projects, but then you're using a feature in a way it wasn't meant to be used.

    Read the article
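
    Splitting a subdirectory out into its own repository is usually done with Mercurial's bundled convert extension rather than with branches, and it keeps that directory's history. A minimal sketch, assuming the server code lives in a folder named server (the folder and repository names here are placeholders):

        # describe what to keep and where to move it
        printf '%s\n' 'include server' 'rename server .' > filemap.txt
        # enable the bundled convert extension for this invocation only and run the split
        hg --config extensions.convert= convert --filemap filemap.txt original-repo server-repo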

  • ASP.NET website development components / APIs

    - by Haseeb Asif
    I have been assigned a new website project in my organization, and my role requires me to finalize all the tools, technologies, controls, APIs, etc. The website will be something like an online store, where every user gets his own store as a subdomain, e.g. user1.myprojectdomain.com. I have been researching a number of options and need your suggestions on the following points. ASP.NET Web Forms vs ASP.NET MVC: preferring Web Forms with an N-tier architecture, because of rapid application development, the large set of toolbox controls, and mainly our team's skill set. Error logging: ELMAH seems to be a nice library. Forums: YetAnotherForum. Online live chat: still looking for something (working on SignalR). Signups with social media: Engage by Janrain. Finally, I need help with how to manage the subdomains: do we create a virtual directory/application for every user in IIS at runtime, or can we do something else?

    Read the article

  • Ubuntu 12.04 LTS 64 bit. Logitech m510 mouse not working!

    - by Alonso
    When I run lsusb, this shows up:

        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 004: ID 046d:c52b Logitech, Inc. Unifying Receiver

    I have followed all of the steps in "Ubuntu cannot detect my Logitech Wireless m510 mouse?", yet my mouse still fails to work. When I try to install HIDpoint, this appears after entering Y:

        libpng does not exist
        libtiff does not exist
        Gathering System information and generating a log
        Launching HIDPoint Installer
        ./hidpointsetup: error while loading shared libraries: libpng.so.3: cannot open shared object file: No such file or directory

    What can I do to fix this?

    Read the article
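
    The installer is a prebuilt binary linked against an old libpng soname. One workaround often suggested for old binaries, assuming the problem really is the missing soname and not a 32-bit vs 64-bit mismatch, is to install libpng12 and give it a compatibility symlink under the name the installer expects. Treat this as a sketch to adapt, not a guaranteed fix:

        sudo apt-get install libpng12-0
        # see where libpng12 actually lives on this system (the path varies between releases)
        ldconfig -p | grep libpng12
        # point the soname the installer wants at that library (adjust the first path to what ldconfig reported)
        sudo ln -s /lib/x86_64-linux-gnu/libpng12.so.0 /usr/lib/libpng.so.3
        sudo ldconfig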

  • Website hosting and deployment

    - by squixy
    I'm relatively new to web development, especially when it comes to infrastructure. I have an AngularJS application built and served locally by brunch.io. It consumes JSON data from a rails-api backend. I'd like to deploy my Angular application separately from the Rails server. For now, the JS app is placed inside the public directory of the backend server and deployed together with it. That is neither elegant nor effective, so I want to use some other hosting setup. I was thinking about a VPS where I could place both the Angular and Ruby applications. I have read about Node.js and Nginx, which can serve static files, but I don't have any knowledge of or experience with these technologies. What is the best way to serve separate frontend and backend applications that communicate with each other?

    Read the article
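
    One common arrangement on a single VPS is to have Nginx serve the compiled Angular files as static assets and reverse-proxy API requests to the Rails app. A minimal sketch, assuming the Rails server listens on localhost:3000, the built frontend lives in /var/www/frontend, and API routes sit under /api/ (domain, paths and port are all placeholders); the server block goes in /etc/nginx/sites-available/:

        server {
            listen 80;
            server_name example.com;

            # serve the compiled Angular app
            root /var/www/frontend;
            index index.html;

            location / {
                try_files $uri $uri/ /index.html;
            }

            # forward API calls to the Rails (rails-api) process
            location /api/ {
                proxy_pass http://127.0.0.1:3000;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }
        }

    Then enable it and reload:

        sudo ln -s /etc/nginx/sites-available/myapp /etc/nginx/sites-enabled/myapp
        sudo nginx -t && sudo service nginx reload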

  • Migrating Ruby on Rails Website to New Server (Linux)

    - by GarytheWorm
    I have an existing website that is a Ruby on Rails project, and another server I need to transfer it to. The server I wish to transfer to originally hosted the website, so the necessary gems and configuration are already installed. I tarred the current, releases and shared directories from the old server, transferred them over, and unpacked the tar in the apps directory at the new location, which is under a different URL path. My problem, as you can see below, is that the current symlink still points to the old path (I ran ls -la to see ownership). How can I change current so it points at my new web address?

        current  releases  shared  sitepack.tar
        root@server1:/var/www/clients/client1/NEWSITE.com/web/apps# ls -la
        current -> /var/www/OLDSITE.com/web/apps/releases/20120130171636
        root@server1:/var/www/clients/client1/NEWSITE.com/web/apps#

    Read the article
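
    The current entry is just a symbolic link left over from the old server's Capistrano-style layout, so it can be repointed with ln. A minimal sketch, assuming the release directory shown in the listing above is the one to activate:

        cd /var/www/clients/client1/NEWSITE.com/web/apps
        # -s symbolic, -f replace the existing link, -n treat the existing link itself as the target, not as a directory
        ln -sfn releases/20120130171636 current
        ls -la current   # verify it now points at the new location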

  • Root access issues - how do others manage this?

    - by Ciaran Archer
    Hi there. I use my Ubuntu 10.04 LTS instance (via VirtualBox on Windows 7) with a non-root user. I am trying out developing Rails applications, and I notice that I need to run some rails commands with sudo. The problem is that some files are then created by the root user, and I cannot edit them from a GNOME window with my logged-in user. What is the correct thing to do here? Should I somehow always log in as root? If so, how? Is there some way to give all files under my home directory (where I do all my Rails work) the correct permissions, so I can edit them with my logged-in user in a normal editor window? At the moment I have to resort to opening files from the command line like this: sudo gedit myFile.rb. This is not very sustainable! Thanks in advance!

    Read the article
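
    A hedged sketch of the usual cleanup: reclaim the root-owned files, then avoid running Rails commands with sudo in the future (for example by keeping gems per-user with RVM or rbenv rather than installing them system-wide). The workspace path below is a placeholder:

        # take ownership of everything under the Rails workspace back
        sudo chown -R "$USER":"$USER" ~/rails_projects
        # sanity check: nothing under it should be owned by root any more
        find ~/rails_projects -user root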

  • How to solve this problem starting new Opera windows on a dual-monitor setup?

    - by Mnementh
    I use Opera and have a setup with two monitors. When I want to open a new URL I run opera -newwindow URL. The outcome differs depending on whether I execute this command in a program on the same screen where Opera is running or on the other one. On the same screen everything is fine: I get the message "opera: Activated running instance" and a new window with the URL is opened. On the other screen I get a dialog with the following message: "It appears another Opera instance is using the same configuration directory because its lock file is active: /home/(my name)/.opera/lock", and the URL is not shown. That happens not only from the console, but also when I click a link in an e-mail and so on. How can I fix this? The window manager is awesome, if that makes a difference.

    Read the article

  • .bash_history and .cache

    - by John Isaacks
    I have a user whose home directory is a Mercurial repository. Mercurial notified me that there were 2 new unversioned files in the repository: .bash_history and .cache/motd.legal-displayed. I assume .bash_history is the history of bash commands for my user; I have no idea what the other one is. I don't want these files to be versioned by Mercurial. Are they safe to just delete, or will they come back or mess something up? Can they be moved somewhere else? Or do I have to add them to my .hgignore file?

    Read the article
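
    Both files are harmless housekeeping data: .bash_history is the shell history, and .cache/motd.legal-displayed just records that the login legal notice has already been shown. They will be recreated if deleted, so the usual approach is to ignore them. A minimal sketch, assuming the repository root is the home directory:

        printf '%s\n' 'syntax: glob' '.bash_history' '.cache/*' >> ~/.hgignore
        hg status   # the two files should no longer appear as unknown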

  • Game Patching Mac/PC

    - by Centurion Games
    Just wondering what types of solutions are available to handle patching of PC/Mac games that don't have any sort of auto-updater built into them. On Windows, do you just build some sort of new installer for the game that includes the updated files, hope you can read a valid registry key to point to the right directory, and overwrite files? If so, how does that translate to the Mac, where the game is normally distributed as a straight-up .app bundle? Is there a better approach than the above for an already-released product? (Assuming direct sales, not a marketplace with auto-updating like Steam.) Are there any off-the-shelf auto-updater libraries that could easily be integrated with a C/C++ code base even after a game has shipped, to make this a lot simpler, and that are cross-platform? Also, how do auto-updaters interact with newer OS releases that want applications and files digitally signed?

    Read the article

  • Downgrading from Ubuntu 11.10 to 10.10, keeping installed programs

    - by Peter
    I recently upgraded from 10.10 to 11.04 and then 11.10, and I'd like to revert back to 10.10. I understand that you cannot downgrade a version as easily as you can upgrade, and that I'll probably have to get the boot CD again and reinstall the whole thing. I know that I can keep most of my files by saving the /home directory, so two questions: Once I've gone back to 10.10, can I just copy my old version of home over the freshly installed one? Is there a way to keep all of my installed programs, or some way of getting the new install to install them automatically? Will I have to go through the tricky setups of things like TeX all over again? Thanks

    Read the article
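
    For the installed-programs question, a commonly used approach is to export the package selections from the current system and replay them on the fresh 10.10 install. A hedged sketch; it only restores packages that still exist under the same names in the 10.10 archives, and it does not carry over configuration:

        # on the 11.10 system, before wiping it
        dpkg --get-selections > my-packages.txt

        # on the fresh 10.10 system, after copying my-packages.txt across
        sudo dpkg --set-selections < my-packages.txt
        sudo apt-get dselect-upgrade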

  • Process development lifecycle in Oracle BPM 11g

    - by mesriniv
    The Oracle BPM 11g platform provides two modeling tools tailored to different audiences. The BPM Process Composer component is a web-based, role-driven, collaborative platform for discovery, design and documentation of business processes, aimed at a business audience. It empowers business users to participate in the definition, feedback and design of business processes. The other modeling tool is Oracle BPM Studio, which runs in the JDeveloper IDE. Irrespective of the tool used, the same BPMN and related artifacts are authored; that is, this is not import/export but multiple tools working with the same assets. In addition to BPMN 2.0, both tools provide editors for process data, organizational roles, human tasks (including assignment and user interface), and business rules. The Oracle BPM design-time repository (Oracle Metadata Services Repository) is the glue that facilitates a shared work environment across multiple BPM Composer and Studio clients. This document explains how to create snapshots and versions of your BPM projects and captures best practices for a shared process development lifecycle: http://java.net/projects/oraclebpmsuite11g/downloads/directory/Samples/bpm-122-processdevelopment-lifecycle

    Read the article

  • Move Joomla website to new folder

    - by Jon
    I currently have a website. I have created a new folder on the site called V2, and under this folder I have installed Joomla and configured my new-look site. I now want to make V2 the default website. I could point the website at the V2 directory, but I have other folders under the current root that I need to keep. How can I transfer V2 to the root of my website? Is it just a case of copying all the files?

    Read the article
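
    Copying the files is most of it, but the Joomla configuration usually also has to be told about its new location. A hedged sketch, assuming the document root is /var/www/html and V2 sits directly under it (both paths are placeholders for your actual layout):

        # copy the Joomla install up one level, preserving permissions,
        # without touching the other folders already in the root
        rsync -av /var/www/html/V2/ /var/www/html/
        # then review configuration.php in the root: $log_path, $tmp_path
        # (and $live_site, if set) may still reference the old V2 paths
        grep -n 'V2' /var/www/html/configuration.php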

  • Do PHP-FPM (and other PHP handlers) need execute permissions on the PHP files they're serving?

    - by Andrew Cheong
    I read in a post on Server Fault that PHP-FPM needs execute permissions. However, the answer in "When creating a website, what permissions and directory structure?" only grants read and write permissions to PHP-FPM. Maybe I don't quite understand how PHP handlers (or CGI in general) work, but the two claims seem contradictory to me. As I understand it, when Apache / Nginx gets a request for foobar.php, it "passes" the file to an appropriate handler. That is, I imagine it's as if www-root (or apache, or whoever the web server runs as) were to run some command like /usr/sbin/php-fpm foobar.php. Actually, no, that's naive, I just realized: PHP-FPM must be a running instance (if it's to be performant, cache, etc.), so presumably PHP-FPM is just being told, "Hey, quick, process this file for me!" In either case, I don't see why execute permissions would be necessary. It's not as if the web server needs to literally execute the file, i.e. ./foobar.php. Is the Server Fault answer simply mistaken?

    Read the article
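
    The intuition above matches how PHP-FPM normally works: the pool worker opens and reads the script, so the file needs to be readable by the pool's user, while the directories leading to it need the execute (search) bit, which is a separate thing from execute permission on the file itself. A small sketch of a layout that is typically sufficient; the path, the deploy user and the www-data pool user are placeholders:

        # directories need the search (x) bit so PHP-FPM can traverse them
        sudo find /var/www/mysite -type d -exec chmod 750 {} \;
        # PHP scripts only need to be readable by the pool user, not executable
        sudo find /var/www/mysite -type f -name '*.php' -exec chmod 640 {} \;
        # let the pool user (www-data) in via the group
        sudo chown -R deploy:www-data /var/www/mysite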

  • Is it safe to convert Windows file paths to Unix file paths with a simple replace?

    - by MxyL
    So, for example, say all of my files will be transferred from a Windows machine to a Unix machine, as in C:\test\myFile.txt to {somewhere}/test/myFile.txt (the drive letter is irrelevant at this point). Currently, the utility library that we wrote ourselves provides a method that does a simple replace of all backslashes with forward slashes:

        public String normalizePath(String path) {
            return path.replace('\\', '/');
        }

    Slashes are reserved and cannot be part of a file name, so the directory structure should be preserved. However, I'm not sure if there are other complications between Windows and Unix paths that I may need to worry about (e.g. non-ASCII names, etc.).

    Read the article

  • Download a file from a web source, selectively

    - by KILL3RTACO
    If anyone has heard of Bukkit, you know that their files usually come in three types: development, beta, and release. Click (here) for examples. I need a script that: loops through the directory; gets the latest stable version (probably as simple as looking at the version number, since they have a simple naming convention: each stable version ends in -Rx.0, while development and beta versions end in -Rx.x). After that, I know I'll need to use wget to download the file. Note: if you're just going to post code, at least tell me what it does so I can use it later if I need to.

    Read the article
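
    A hedged sketch of the shape such a script could take. The listing URL and file extension are placeholders (the real download page was only linked above, not shown), and it assumes the stable builds really do follow the ...-Rx.0 naming convention described in the question:

        #!/bin/bash
        # placeholder: the page that lists the downloadable artifacts
        LISTING_URL="http://example.com/bukkit/downloads/"

        # fetch the listing, keep only names that end in -R<number>.0.jar (stable builds),
        # sort them as version numbers and take the highest one
        latest=$(curl -s "$LISTING_URL" \
                  | grep -oE '[A-Za-z0-9._-]+-R[0-9]+\.0\.jar' \
                  | sort -Vu | tail -n 1)

        echo "Latest stable build appears to be: $latest"
        wget "${LISTING_URL}${latest}"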

  • Where are Nagios 3 Config Files in Ubuntu 12.04?

    - by Aaron James
    I just installed Nagios3 via Synaptic. The package and its dependencies all installed fine, and I can log in using a web browser. Now I'd like to add hosts, but according to the official Nagios documentation the config files should be in the /usr/local/nagios/ directory. When I go to /usr/local, it isn't there, and I can't seem to find the config files anywhere. I'm not sure what I did wrong. I'm running Xubuntu 12.04 64-bit. Any help at all would be greatly appreciated, thank you!

    Read the article
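
    The official documentation describes a source install; the Ubuntu/Debian nagios3 package lays things out differently, and its configuration normally lands under /etc/nagios3. If in doubt, the package itself can report where it put everything:

        # list every file the nagios3 package installed, filtered to configuration paths
        dpkg -L nagios3 | grep /etc
        # the main config file and the directory for your own host/service definitions
        ls /etc/nagios3/nagios.cfg /etc/nagios3/conf.d/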

  • Crontab opens as blank page, cannot save

    - by Sarah
    I am really not familiar with Linux and only started using it recently, so be patient with me. I am trying to control a camera at regular intervals through a script that is called from the crontab. When I start up the computer, I can open the crontab, edit and save it, and everything is executed correctly. However, I can never open the crontab a second time unless I restart the computer first. If I type crontab -e, I get a blank page, located in the /tmp directory. I can enter my commands there, but I cannot save the file. I don't know if this is relevant, but when I try sudo crontab -e, I get something like "no cron installed for root". Any help is really appreciated! Sarah

    Read the article
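
    One thing worth checking here is which editor crontab -e launches and whether it exits cleanly; a blank temporary file in /tmp that cannot be saved is often an editor or terminal mix-up rather than a cron problem. A hedged sketch of the usual checks on Ubuntu:

        # see (and change) which editor crontab -e will use; nano is the safest choice
        select-editor
        # or force a specific editor for this one invocation
        EDITOR=nano crontab -e
        # confirm what is actually installed for this user
        crontab -l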

  • How do you forcibly unmount a disk when you press the eject button on an optical drive?

    - by Michael Curran
    When upgrading my hardware, I also upgraded to Ubuntu 10.10. On my previous system (10.04 and earlier), when I ejected a disc from the optical drive, the subfolder in the /media directory was automatically removed. On my new 10.10 system, if I don't eject the disc using the eject command within the system, the disc remains mounted, even after a new disc is inserted. The new drive is a Blu-ray drive, but I haven't noticed any other problems with it. Normally this isn't a problem, but it makes installing applications that are spread over multiple CDs more difficult in many cases (e.g. Wine). Any advice?

    Read the article
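
    If the goal is simply to get the behaviour of the software eject from the hardware button, one workaround is to unmount and eject from a terminal (or bind the command to a keyboard shortcut); a minimal sketch, assuming the drive appears as /dev/sr0:

        # eject unmounts the disc first if necessary, then opens the tray
        eject /dev/sr0
        # check that nothing from the disc is still mounted
        mount | grep /media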

  • Unable to remove some unity lenses

    - by S Prasanth
    I removed the files, video, photos and friends lenses with the following command: sudo apt-get purge unity-lens-files unity-lens-video unity-lens-photos unity-lens-friends. Although the corresponding results have disappeared from the Dash, only the friends tab has actually been removed; there are still tabs for files, video and photos, albeit empty. How do I remove these empty tabs? I use Ubuntu 13.10 Saucy Salamander, and I understand this issue didn't exist in 12.04. The directory structure of Unity lenses seems to have changed from 12.04 to 13.10: the lenses used to be stored in /usr/share/unity/lenses/, but that is no longer the case, which makes this answer inapplicable: http://askubuntu.com/a/120116/111720

    Read the article
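
    In 13.10 the Dash content comes largely from separately packaged scopes rather than the old lens packages, so the leftover empty tabs may be fed by unity-scope-* packages that are still installed. A hedged sketch for finding and removing the likely candidates; the package names in the purge line are only examples of what the listing might show:

        # list the scope packages that are still installed
        dpkg -l 'unity-scope-*' | grep '^ii'
        # then purge the ones behind the tabs you want gone, for example:
        sudo apt-get purge unity-scope-video-remote unity-scope-musicstores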

  • After changing web host, I get a 'file does not exist' error

    - by Jordan
    I run a WordPress blog and have recently changed web hosts. When switching, I copied all the files and exported/imported the database, etc., as explained in lots of tutorials easily found on Google. The blog home page works fine. What goes wrong: when I click on any link from the home page, the browser gets stuck in a redirect loop. Looking at the error log, I see: File does not exist: /usr/local/apache/htdocs/index.php. The directory /usr doesn't even exist for my website, so perhaps this is looking for a file that was present with my old web host and is no longer present with my new one? What is going on, and how might I resolve it?

    Read the article
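
    Two things are worth checking after a move like this: that the site URL stored in the WordPress database matches the new host, and that the .htaccess rewrite rules copied across don't still assume a path from the old server. A hedged sketch; the database name, credentials, table prefix wp_ and site path are placeholders:

        # 1. check what WordPress thinks its URLs are
        mysql -u dbuser -p wordpress_db \
          -e "SELECT option_name, option_value FROM wp_options
              WHERE option_name IN ('siteurl','home');"
        # 2. inspect the permalink rules that came over from the old host
        cat /path/to/site/.htaccess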

  • Why can't I copy files to /var/www/html?

    - by Alaa M.
    I copy some files, go to /var/www/html, right-click, and Paste isn't available. Why is that? I tried the command sudo nautilus; it opened the file navigation window, I navigated to /html, and the problem was still there. I also tried gksudo nautilus, but it said: "The program 'gksudo' is currently not installed. You can install it by typing: sudo apt-get install gksu". I installed it, tried again, same problem. What should I do? Edit: it turns out I was accessing the files inside a .zip archive and should have extracted them first. Solved.

    Read the article

  • How to hide bind mounts in nautilus?

    - by Bazon
    Summary: how do I stop folders mounted via bind or bindfs in /etc/fstab from appearing as devices in the left-hand "Places" column of Nautilus? Details: I mount various directories from my data partition into my home directory via bind entries in /etc/fstab, e.g. like this:

        # using bind:
        /mnt/sda5/bazon/Musik /home/Bazon/Musik none bind,user 0 0
        # or using bindfs:
        bindfs#/mnt/sda5/tobi/Downloads /home/tobi/Downloads fuse user 0 0

    (Background: /dev/sda5, mounted at /mnt/sda5, is my old home partition, but I do not want to mount it as a home partition, as I always have at least two Linuxes on the computer.) That works well, but since 12.10 every one of those items is listed in Nautilus in the left column under "Devices" (where USB drives and so on normally appear). This wastes space, as I have many such mounts, so I would like these mounts hidden, just as they were in 12.04. How can I do that? Thanks!

    Read the article
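
    If the gvfs version in use understands the x-gvfs-hide mount option (newer releases do; whether the 12.10-era one does is something to verify on your system), adding it to the fstab entry tells GNOME/Nautilus not to show that mount as a device. A hedged sketch based on the first entry above; remount or reboot afterwards:

        # /etc/fstab -- same bind entry as before, with x-gvfs-hide added
        /mnt/sda5/bazon/Musik /home/Bazon/Musik none bind,user,x-gvfs-hide 0 0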

  • Setting a folder to be writable by Apache/PHP in Windows?

    - by Chris Sobolewski
    I have a local test server, and I am attempting to write a file with PHP. I am getting a message that the folder (../uploads/) does not exist or I do not have permission. My directory structure is:

        D:\xampp\htdocs\website\          <-- root
        D:\xampp\htdocs\website\library   <-- where the script runs
        D:\xampp\htdocs\website\uploads   <-- where I'd like to save

    I know that on a *nix server I can just chmod the permissions to 0777. What do I need to set on my Windows box to give Apache the ability to write a file?

    Read the article

  • How can I run samba?

    - by depesz
    I have a server running Ubuntu 10.10. I've never used Samba before, as I never had Windows machines, but now I need it. So I did apt-get install samba smbfs smbclient. The packages are installed, but I have no idea how to configure it. All the howtos I found on the net refer to /etc/samba/something.conf, but I don't even have an /etc/samba directory. The only config I found is /etc/default/samba, which contains (aside from comments) only: RUN_MODE="daemons". All I want is to be able to access some directories on the Ubuntu machine from Windows, nothing else.

    Read the article
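
    Normally /etc/samba/smb.conf is shipped by the Samba packages, so its absence suggests the install did not complete cleanly; reinstalling should bring the default config back, after which a share is only a few lines. A hedged sketch: first the stanza to append to /etc/samba/smb.conf (share name, path and username are placeholders):

        [shared]
           path = /home/depesz/shared
           read only = no
           valid users = depesz

    Then the commands around it, hedged the same way:

        # restore the default configuration files
        sudo apt-get install --reinstall samba samba-common
        # give the Unix user a Samba password, check the config, restart the daemon
        sudo smbpasswd -a depesz
        testparm
        sudo service smbd restart   # the service may be called smbd or samba on 10.10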

  • PHP-accessible shared content between two websites on the same VPS on different domains/IPs

    - by Lee Fentress
    I have two e-commerce websites selling music digital downloads on the same VPS, currently managed with cPanel/WHM (though I'm thinking of switching to Virtualmin). They have separate domains and IPs, of course. They both sell from the same set of music files, so I currently keep duplicate copies in each website's directory, which takes up a lot of disk space. How might I share the same set of music files across both sites, with PHP access, without breaking my shopping cart's ability to serve customers their downloads after they have paid for them? I thought of maybe using symlinks or something, but I don't know whether that's possible, or whether it would have to somehow circumvent built-in security features of the server. I'm new to VPS management.

    Read the article
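
    One common layout is to keep a single copy of the music outside both docroots and point each site at it with a symlink; whether PHP can then read the files depends on settings such as open_basedir and, if the files are served directly by Apache, the FollowSymLinks option. A hedged sketch (all paths are placeholders):

        # one shared copy, readable by both sites' PHP users
        sudo mkdir -p /home/shared/music
        sudo chmod 755 /home/shared/music

        # replace each site's private copy with a symlink to the shared one
        ln -s /home/shared/music /home/site1/public_html/music
        ln -s /home/shared/music /home/site2/public_html/music

        # if PHP reports open_basedir restrictions, /home/shared/music has to be
        # added to each site's open_basedir list in its PHP configuration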
