Search Results

Search found 4686 results on 188 pages for 'folders'.

  • Where should common static resources (images, js, css, etc) go in DotNetNuke?

    - by Joosh21
    Is there a recommended location for static resources (images, css, js, etc.) in a DotNetNuke 5.x installation? There are /images and /js folders, as well as a /Resources folder that already contains resources. There appears to be some overlap, as MicrosoftAjax.js is in multiple locations (though these might be different versions?). I could also put resources in a /DesktopModule/ModuleX location. Does anyone know if there is a difference between using any of these folders? I like the idea of all static resources living under a common folder (/Resources) so I could set caching headers, permissions, etc. on them in one place. Has anyone used a separate image server to serve DotNetNuke static content? http://stackoverflow.com/questions/913208/pros-and-cons-of-a-separate-image-server-e-g-images-mydomain-com
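
    On the caching idea: under IIS7 in integrated mode, a folder-level web.config can set cache headers for everything below it. A minimal sketch, assuming a /Resources folder and a seven-day max-age (both are illustrative choices, not DotNetNuke defaults):

        <!-- /Resources/web.config : cache all static content under this folder -->
        <configuration>
          <system.webServer>
            <staticContent>
              <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
            </staticContent>
          </system.webServer>
        </configuration>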

  • FileSystemWatcher bypassing Active Directory restrictions

    - by DevexPP
    While experimenting with FileSystemWatcher, I've found that it somehow bypasses Active Directory's restrictions on files and folders, and will raise change events with information about what has changed in files and folders that you don't even have access to. I have a few questions about that: 1) Why does this happen? 2) Is this a problem in the AD configuration? How do I fix it? 3) Is there any way to gather these files, or even create a FileSystemInfo for them, to get more info about the files (not only the changes made to them)? As far as I've tried, only the FileSystemWatcher is immune to the restrictions; I can't run anything else against those paths. Here's a list of what I've tried: File.Exists, Directory.Exists, a FileInfo instance on found files, a DirectoryInfo instance on found files, File.Copy, File.Delete.
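
    For anyone reproducing this, a minimal sketch of the behavior described: a watcher on a share, run under an account that lacks read access to parts of the tree (the path is a placeholder):

        using System;
        using System.IO;

        class WatcherDemo
        {
            static void Main()
            {
                // Path is hypothetical: a share the current account can only partly read.
                FileSystemWatcher watcher = new FileSystemWatcher(@"\\server\share");
                watcher.IncludeSubdirectories = true;
                watcher.NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite;

                watcher.Changed += delegate(object s, FileSystemEventArgs e)
                {
                    // The event fires even for paths the account cannot read...
                    Console.WriteLine("Changed: " + e.FullPath);

                    // ...but a direct probe of the same path is subject to ACLs;
                    // File.Exists returns false (rather than throwing) on access denied.
                    Console.WriteLine("  File.Exists: " + File.Exists(e.FullPath));
                };

                watcher.EnableRaisingEvents = true;
                Console.ReadLine(); // keep the process alive while events arrive
            }
        }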

  • Filter Empty Directories in Package Explorer View

    - by Matt
    Is there a way in Eclipse to filter/hide empty directory trees in the Package Explorer view? This is different from filtering directories like '.svn' or Maven's target, or filtering empty packages; it's more about cleaning up empty directory trees that show up as a result of filter rules. Context: we have a generic project in our workspace that uses filters to ignore non-text-based files (mp3s, jpgs, etc.), which lets us quickly edit our text files in Eclipse. The problem is that, because of the filters, a lot of empty folders are shown. If Eclipse could hide any folders left empty by the filters, it would make the project much cleaner.

  • another file_exists with special chars problem

    - by Camran
    I have some folders with special characters in their names. I am currently on a Windows test machine, but later I will use Linux. My problem is that the folders with special characters in their names are somehow not recognized. For example:

        file_exists('../Bilar/27733691_1.jpg') // TRUE
        file_exists('../Båtar/27733691_1.jpg') // FALSE, because of the special char in the folder name

    How should I solve this? I plan to run Linux once the website is online; would that matter? Please explain thoroughly, because I am a newb at this. Thanks
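
    One likely cause is an encoding mismatch: a UTF-8 script passing paths to a Windows filesystem that expects the ANSI codepage. A small sketch of converting the path first (the Windows-1252 target is an assumption about the system locale):

        <?php
        // The script source is assumed to be saved as UTF-8.
        $path = '../Båtar/27733691_1.jpg';

        // Convert to the assumed Windows ANSI codepage before touching the filesystem.
        $fsPath = iconv('UTF-8', 'Windows-1252', $path);

        var_dump(file_exists($fsPath)); // expected TRUE on Windows if the guess is right

        // On Linux, filesystems commonly store UTF-8 names, so the original
        // $path should work unchanged once the site moves there.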

  • Xcode File management. What is best practice?

    - by ian1971
    I've been using Xcode for a while now, and one thing that always bugs me is the way it handles files. I like to have my files in nested folders rather than one big physical folder, but when you create a group in Xcode, by default it does not create a folder on disk, just a virtual folder within the project. I can see that virtual folders are great for linking code from arbitrary places into your project, but once you get beyond a few classes I find the one-big-folder approach really painful. And if you try to fix it later, it takes ages and it's easy to break your build. Is it possible to change this behaviour so that by default a group creates a physical folder? Or am I doing it wrong and clinging to some other way of working? How do other people work with files in Xcode?

  • dependent drop down list

    - by sushant
    I want to make two drop-down lists. The first list has static data (the folder structure), so I can use an array for it; depending on the folder selected in the first list, the second list should show the sub-folders inside it. But the sub-folders keep changing, so I have to use ASP's FileSystemObject for that. I am using the following FSO code:

        <%@ Language=VBScript ENABLESESSIONSTATE=False %>
        <%
        Dim fso, folder, files
        Set fso = Server.CreateObject("Scripting.FileSystemObject")
        Set folder = fso.GetFolder("D:\")
        Set files = folder.SubFolders
        For Each folderIdx In files
            Response.Write("<option>" & folderIdx.Name & "</option>")
        Next
        %>

    I don't know how to make such a dependent list. Any help is really appreciated, and I am sorry for the formatting issues.
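
    One common pattern, sketched below under the assumption that a page reload per selection is acceptable: the first list puts the chosen folder in the query string, and the FSO code builds the second list from it (folder names and the D:\ root are placeholders):

        <%@ Language=VBScript ENABLESESSIONSTATE=False %>
        <%
        Dim selected
        selected = Request.QueryString("folder")   ' folder picked in the first list
        ' Real code should validate "selected" against the known folder list
        ' before using it in a path, to avoid traversal attacks.
        %>
        <select name="folder"
                onchange="location.href = '?folder=' + encodeURIComponent(this.value);">
            <option>FolderA</option>
            <option>FolderB</option>
        </select>
        <select name="subfolder">
        <%
        If selected <> "" Then
            Dim fso, subFolder
            Set fso = Server.CreateObject("Scripting.FileSystemObject")
            For Each subFolder In fso.GetFolder("D:\" & selected).SubFolders
                Response.Write("<option>" & subFolder.Name & "</option>")
            Next
        End If
        %>
        </select>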

  • which MVC for coldfusion?

    - by mrjayviper
    Newbie to MVC here, so please be gentle. I want to move one of our existing apps (currently located at http://www.companywebsitegoeshere.com/myapp1) to MVC. As you can see from the URL, there are many apps running on the website, which I intend to move at some stage. I don't have write access to the wwwroot (/var/www/html in my case), and the web server is used by multiple developers across the company. I'm hoping I can have the MVC framework core files/folders plus my app all located in one subfolder. In the case of myapp1, all the files and folders would live inside the /var/www/html/myapp1 subfolder. Can you please point me in the right direction (links/guides/docs/videos/etc.)? I've looked at several frameworks like cfwheels/mach-ii/fw1, but they all seem to require wwwroot access. Thanks! :)
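
    Of the frameworks mentioned, FW/1 is a single-folder framework and is generally the easiest to run entirely inside an application directory. A sketch, assuming ColdFusion 9+ script components and the FW/1 "framework" folder copied into /var/www/html/myapp1 (the mapping may be unnecessary depending on the FW/1 version):

        // /var/www/html/myapp1/Application.cfc  (a sketch, not tested)
        component extends="framework.one" {
            this.name = "myapp1";

            // Per-application mapping so "framework.one" resolves inside this
            // subfolder without touching wwwroot or server settings.
            this.mappings["/framework"] = getDirectoryFromPath(getCurrentTemplatePath()) & "framework";
        }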

  • automatically rewrite URLs in ASP.NET

    - by Ali_dotNet
    I use VS2010/C# to develop an ASP.NET web site. My customers want their pages to look like this: mysite.com/customer (in fact that URL serves mysite.com/customer/default.aspx). So far I've manually created a folder for each customer and put a default.aspx file inside it, so that users can view a customer's page by typing mysite.com/customer. Is there a better way to handle this scenario? I don't want mysite.com/customer1.aspx; I want mysite.com/customer1. Is there any way I can remove the folders (and their default.aspx files) and generate the pages automatically from my customers database? Should I use URL rewriting? Is there any way I can create the page mysite.com/customer1.aspx and have users view it by typing mysite.com/customer1? I think it is possible to rewrite URLs in web.config, but I don't want to maintain that by hand, as my pages will increase on a daily basis. Thanks
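
    Since the site is on .NET 4, ASP.NET routing can map every customer URL to one physical page, with no per-customer folders and no web.config entries. A sketch (Customer.aspx and the database lookup are hypothetical):

        // Global.asax.cs : one route covers every customer (a sketch)
        using System.Web.Routing;

        protected void Application_Start(object sender, EventArgs e)
        {
            // mysite.com/{name} is served by a single physical page.
            RouteTable.Routes.MapPageRoute("customer", "{name}", "~/Customer.aspx");
        }

        // Customer.aspx.cs : resolve the customer from the URL segment
        protected void Page_Load(object sender, EventArgs e)
        {
            string name = (string)Page.RouteData.Values["name"];
            // LoadCustomerContent(name) is a hypothetical helper that queries
            // the customers database and binds the page content.
        }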

  • FTP .NET Sockets

    - by Jojo
    Hi everyone, I have an FTP auto-downloader program that downloads new files from a given FTP folder. The application worked for the FTP folders I tested, which contain 30-50 files. However, when I tried FTP folders with 150 and with 18,000 files, I received this error message: "An established connection was aborted by the software in your host machine." My first assumption is that it's a firewall or anti-virus issue. I don't have administrative access to this computer, so I would like to ask whether there are other possible causes before I raise this with our systems dept. Need anyone's help asap. Thanks :)
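
    If the downloader is built on FtpWebRequest, it may be worth ruling out timeouts and stale keep-alive control connections before involving the systems department, since large listings take much longer than 30-50-file folders. A sketch (URI and credentials are placeholders; requires System.Net and System.IO):

        // Inside the per-file download routine (a sketch):
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://host/folder/file.dat");
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("user", "password");

        // Generous timeouts (milliseconds) for big folders and slow listings.
        request.Timeout = 10 * 60 * 1000;
        request.ReadWriteTimeout = 10 * 60 * 1000;

        // Use a fresh control connection per file instead of reusing a stale one.
        request.KeepAlive = false;

        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        using (Stream data = response.GetResponseStream())
        {
            // ... copy "data" to a local file ...
        }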

  • Is it possible to have a .NET route that maps to the same place as a directory?

    - by Austin
    I'm building a CMS using WebForms on .NET 4.0 and have the following route that allows URLs like www.mysite.com/about to be mapped to the Page.aspx page, which looks up the dynamic content. routes.MapPageRoute("page", "{name}", "~/Page.aspx"); The problem is that I have a couple of folders in my project that are interfering with possible URLs. For example, I have a folder called "blog" where I store pages related to handling blog functionality, but if someone creates a page for their site called "blog" then navigating to www.mysite.com/blog gets the following error: 403 - Forbidden: Access is denied. You do not have permission to view this directory or page using the credentials that you supplied. Other similar URLs route correctly, but I think because .NET is identifying /blog as a physical location on the server it is denying directory access. Is there a way to tell IIS / .NET to only look for physical files instead of files and folders?
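
    One knob worth trying, sketched below: RouteCollection.RouteExistingFiles makes routing run even when the URL matches a physical file or folder, after which genuinely static paths need explicit ignore rules (the ignore pattern here is illustrative):

        // Global.asax.cs (a sketch)
        void RegisterRoutes(RouteCollection routes)
        {
            // Evaluate routes even when a physical file/folder matches the URL.
            routes.RouteExistingFiles = true;

            // Keep real static content reachable (pattern is illustrative).
            routes.Ignore("Styles/{*path}");

            routes.MapPageRoute("page", "{name}", "~/Page.aspx");
        }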

  • need a code snippet to find all *.html under a folder in nodejs

    - by Nicolas S.Xu
    I'd like to find all *.html files in the src folder and all of its sub-folders using Node.js. What is the best way to do it?

        var folder = '/project1/src';
        var extension = 'html';
        var cb = function(err, results) {
            // results is an array of the files with path relative to the folder
            console.log(results);
        };

        // This function is what I am looking for.
        // It has to recursively traverse all sub-folders.
        findFiles(folder, extension, cb);

    I suspect many developers already have a great, tested solution for this, and it is better to use one of those than to write my own.
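
    For reference, a minimal recursive sketch using only the core fs and path modules, matching the findFiles(folder, extension, cb) signature above. Error handling is simplified (cb could fire more than once if several operations fail):

        var fs = require('fs');
        var path = require('path');

        function findFiles(folder, extension, cb) {
            var results = [];
            var pending = 1; // outstanding readdir/stat operations

            function walk(dir) {
                fs.readdir(dir, function (err, names) {
                    if (err) return cb(err);
                    pending += names.length;
                    if (--pending === 0) return cb(null, results); // empty tree
                    names.forEach(function (name) {
                        var full = path.join(dir, name);
                        fs.stat(full, function (err, stat) {
                            if (err) return cb(err);
                            if (stat.isDirectory()) {
                                walk(full); // its pending slot is released by walk's readdir
                                return;
                            }
                            if (path.extname(full) === '.' + extension) {
                                results.push(path.relative(folder, full));
                            }
                            if (--pending === 0) cb(null, results);
                        });
                    });
                });
            }

            walk(folder);
        }

        findFiles('/project1/src', 'html', function (err, results) {
            if (err) throw err;
            console.log(results); // paths relative to the folder, as in the question
        });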

  • Password protect web pages on Windows CE 6

    - by Chris
    I am using the default web server on Windows CE 6 and wish to password-protect certain folders. The default VROOT /RemoteAdmin/ is password protected, and that works, but my own configuration doesn't. I have tried mimicking its settings on my own folders with little success. Here is how one looks: in the HKLM\Comm\HTTPD\VROOTS key I have created a subkey called /web/configuration (this folder actually exists on the box) with the following values:

        A = 1
        DefaultPage = config.html
        Path = /hard disk/webroot/web/configuration/
        UserList = ADMIN

    This is nigh on identical to the settings for /RemoteAdmin/, yet /RemoteAdmin/ requests a password and /web/configuration doesn't (even after a reboot).

  • Analyze VS2010 C# projects and report files on disk not part of the projects?

    - by Lasse V. Karlsen
    I discovered earlier tonight that files and folders I have removed from my C# projects are apparently still on disk, even though my Visual Studio Mercurial plugin normally does a good job of deleting them when I delete them in Visual Studio; it must have hiccuped on these files. So I wondered: does anyone have a script, or know of a tool, that will look at my .csproj files and report extra files and folders on disk that aren't part of the projects? I just want to clean up my repository contents.
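
    As a starting point, a rough sketch that pulls the Include attributes out of a .csproj and diffs them against the directory tree. It skips bin/obj noise and would need refinement for wildcard or linked items (Path.GetFullPath rejects wildcard patterns, and MSBuild variables won't resolve):

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;
        using System.Xml.Linq;

        class OrphanFinder
        {
            static void Main(string[] args)
            {
                string projPath = Path.GetFullPath(args[0]); // e.g. MyApp.csproj
                string projDir = Path.GetDirectoryName(projPath);

                // Every item the project references via an Include attribute.
                var inProject = new HashSet<string>(
                    XDocument.Load(projPath).Descendants()
                        .Where(e => e.Attribute("Include") != null)
                        .Select(e => Path.GetFullPath(Path.Combine(projDir, e.Attribute("Include").Value))),
                    StringComparer.OrdinalIgnoreCase);

                // Everything actually on disk, minus build output.
                var onDisk = Directory.EnumerateFiles(projDir, "*", SearchOption.AllDirectories)
                    .Where(f => !f.Contains(@"\bin\") && !f.Contains(@"\obj\"));

                foreach (string file in onDisk.Where(f => !inProject.Contains(f)))
                    Console.WriteLine("Not in project: " + file);
            }
        }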

  • @echo off in DOS (cmd)

    - by Rayne
    I'm trying to write a BAT script and I have the following:

        @echo off
        REM Comments here
        SETLOCAL ENABLEDELAYEDEXPANSION
        set PROG_ROOT=C:\Prog
        set ONE=1
        echo 1>> %PROG_ROOT%\test.txt
        echo %ONE%>> %PROG_ROOT%\test.txt
        for /f "tokens=*" %%f in (folders.txt) do (
            echo %%f>> %PROG_ROOT%\test.txt
        )
        ENDLOCAL

    My folders.txt contains the number "5". My test.txt output is:

        ECHO is off
        ECHO is off
        5

    I don't understand why the first two lines of output are "ECHO is off" while the third line is printed correctly. How do I print the correct output?
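
    The likely culprit: cmd parses a trailing digit before a redirection as a stream number, so in "echo 1>> file" the "1>>" means "redirect stream 1 (stdout)" and echo runs with no arguments, printing its status. The same happens on the second line, because %ONE% expands to 1 before the redirection is parsed. The for-loop line works because %%f is expanded after redirections have been parsed. A sketch of the usual workarounds:

        @echo off
        set PROG_ROOT=C:\Prog
        set ONE=1
        rem Put the redirection first, so the digit stays part of echo's argument:
        >> "%PROG_ROOT%\test.txt" echo 1
        >> "%PROG_ROOT%\test.txt" echo %ONE%
        rem Or parenthesize the command:
        (echo %ONE%)>> "%PROG_ROOT%\test.txt"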

  • Find -type d with no subfolders

    - by titatom
    Good morning! This is a simple one, I believe, but I am still a noob :) I am trying to find all folders with a certain name. I am able to do this with the command:

        find /path/to/look/in/ -type d | grep .texturedata

    The output gives me lots of folders like this:

        /path/to/look/in/.texturedata/v037/animBMP

    But I would like it to stop at .texturedata:

        /path/to/look/in/.texturedata/

    I have hundreds of these paths and would like to lock them down by piping the output of grep into chmod 000. I was once given a command with the argument -dpe, but I have no idea what it does, and the Internet has not been able to help me determine its usage. Thank you very much for your help!
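
    find can do both the name match and the pruning itself, which avoids the grep entirely; a sketch (run the first form alone to check the list before applying chmod):

        # List only the .texturedata directories themselves, not their contents:
        find /path/to/look/in/ -type d -name '.texturedata' -prune -print

        # Same match, but lock each directory down:
        find /path/to/look/in/ -type d -name '.texturedata' -prune -exec chmod 000 {} +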

  • Customizing Document library web part

    - by Sushant
    Hi, I am developing a website in SharePoint 2007 and have come across a tricky problem. I have a document library web part on a web part page, using the summary toolbar view. I don't want users to add new documents on the first screen, where the folders are shown; I want them to open a folder and add documents there. I cannot use the "No toolbar" view, because that removes the link from every subsequent page as well. Has anyone implemented something like this? Please help.

  • jQuery update link

    - by Happy
    Here is the HTML:

        <a href="http://site.com/any/different/folders/picture_name.jpg">Go and win</a>
        <a href="http://site.com/not/similar/links/some_other_name.png">Go and win</a>

    How do I add some text after the last "/" in the href attribute of each link (before picture_name.jpg)? The script should produce something like:

        <a href="http://site.com/any/different/folders/user_picture_name.jpg">Go and win</a>
        <a href="http://site.com/not/similar/links/user_some_other_name.png">Go and win</a>

    Here user_ has been added. The links can be of any length.
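
    A sketch of one way to do it, using jQuery's attr() callback form and a regex that prefixes the last path segment (the bare "a" selector would need narrowing on a real page):

        $('a').attr('href', function (i, href) {
            // Insert "user_" right after the last "/" of the current href.
            return href.replace(/\/([^\/]*)$/, '/user_$1');
        });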

  • How can I use git for a framework and for a project using that framework while keeping the projects separate?

    - by Kevin
    We are developing a web application and the framework under it. I would like to be able to use the framework for other projects, and I am even considering making it open source. Right now each developer has two separate folders, one for each git project, and a third folder with symlinks to the files in the two project folders. This works, but we have to pull both the framework and the app, and if they get out of sync nothing works. We are going to start a second app using the framework soon. Is there a better way to do this?
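
    Git submodules are the built-in mechanism for this layout: the application repository records the exact framework commit it was tested against, so the two cannot silently drift apart. A sketch (URLs and paths are placeholders):

        # Inside the app repository: pin the framework at a path.
        git submodule add git://example.com/framework.git framework
        git commit -m "Add framework as a submodule"

        # Fresh clones then fetch the recorded framework commit with:
        git clone git://example.com/app.git
        cd app
        git submodule update --init

        # Upgrading the app to a newer framework commit:
        cd framework && git pull origin master && cd ..
        git add framework && git commit -m "Bump framework"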

  • .net File.Copy very slow when copying many small files (not over network)

    - by Guavaman
    I'm making a simple folder-sync backup tool for myself and ran into quite a roadblock using File.Copy. In tests copying a folder of ~44,000 small files (Windows mail folders) to another drive in my system, I found that File.Copy was over 3x slower than running xcopy from the command line to copy the same files and folders: my C# version takes over 16 minutes, whereas xcopy takes only 5. I've tried searching for help on this topic, but all I find is people complaining about slow copying of large files over a network; this is neither a large-file problem nor a network-copying problem. I found an interesting article about a better File.Copy replacement, but the code as posted has some errors that cause problems with the stack, and I am nowhere near knowledgeable enough to fix them. Are there any common or easy ways to replace File.Copy with something faster?
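
    Before reaching for unmanaged replacements, one low-risk experiment is a plain stream copy with a large buffer and a sequential-scan hint, sketched below; though with ~44,000 small files, per-file overhead (open/close, metadata) may dominate whichever copy routine is used. The 1 MB buffer is a guess to tune:

        // A sketch; unlike File.Copy it does not carry over file attributes.
        using System.IO;

        static void StreamCopy(string source, string dest)
        {
            const int bufferSize = 1024 * 1024; // 1 MB; worth experimenting with

            using (FileStream input = new FileStream(source, FileMode.Open, FileAccess.Read,
                       FileShare.Read, bufferSize, FileOptions.SequentialScan))
            using (FileStream output = new FileStream(dest, FileMode.Create, FileAccess.Write,
                       FileShare.None, bufferSize, FileOptions.SequentialScan))
            {
                byte[] buffer = new byte[bufferSize];
                int read;
                while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                    output.Write(buffer, 0, read);
            }
            // Preserve at least the timestamp, which a sync tool will care about.
            File.SetLastWriteTimeUtc(dest, File.GetLastWriteTimeUtc(source));
        }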

  • Long pause when accessing DFS namespace

    - by Matt
    We've recently migrated our Windows network to use DFS for shared files. DFS is working well, except for one annoying problem: users experience a significant delay when they try to access a DFS namespace that they have not accessed for some time. I have tried to troubleshoot the issue but have not had any success so far, and I was hoping someone here may have some pointers to help resolve the problem. Firstly, some background on our network:
    - The network uses a Windows 2008 functional level Active Directory domain with two Windows 2008 DCs and two DNS servers (one on each of the DCs). The network is DNS only, no WINS.
    - All computers are located at the same site and connected by Gigabit Ethernet.
    - We have approximately 20 domain-based DFS namespaces in Windows 2008 mode, and each namespace has two Windows 2008 DFS namespace servers (the same two servers for all namespaces). All namespace servers are in FQDN mode and all folder targets are specified using their FQDN.
    - All computers are up-to-date with Service Packs and patches.
    - The actual folder targets (i.e. the SMB shares our DFS folders point to) are scattered across several file and application servers, all running Windows 2008 bar two application servers which run Windows 2003 R2, with no replication set up at all (all DFS folders currently have only one folder target).
    Some more detail on the problem:
    - The namespace access delay is generally 1-10 seconds long and seems to occur when a particular computer has not accessed the requested namespace for approximately five minutes or more. For example, if the user has not accessed \\domain.name\namespace1\ for more than five minutes and attempts to access it via Windows Explorer, the Explorer window will freeze for 1-10 seconds before finally resuming and displaying the folders that exist in \\domain.name\namespace1. If they then close the Explorer window and attempt to access \\domain.name\namespace1\ again within five minutes, the contents are displayed almost instantly; if they wait longer than five minutes, it goes through the 1-10 second pause again. Once "inside" the namespace everything is nice and snappy; it's just the initial connection to the namespace that is slow.
    - The browsing delays seem to affect all variants of Windows that we use (Windows 2008 x64 SP2, Windows 2003 R2 x86 SP2, Windows XP Pro x86 SP3). It is possibly a bit worse in Windows XP / 2003 than in Windows 2008, but I'm not sure the difference isn't just psychological.
    - Accessing the underlying folder targets directly (bypassing DFS) exhibits no delay at all.
    - During troubleshooting I noticed that the "Cache duration" for all of our DFS roots is set to 300 seconds, i.e. 5 minutes. Given that this is the same amount of time required to trigger the pause, I assume this caching is somehow related, although I am unsure exactly what is cached on the client and hence what needs to be looked up again after 5 minutes have elapsed.
    In trying to resolve the problem I have already tried / checked the following (without success):
    - Ran dcdiag on both Domain Controllers: no problems found.
    - Did some basic DNS server checks without finding any problems. I don't know how to check the DNS servers in detail, but the network is not exhibiting any other strange behaviour that might point to a DNS problem.
    - Disabled anti-virus on clients and servers.
    - Removed one of the namespace servers from a couple of namespaces: no difference.
    So that's where I'm up to, and I'm out of ideas. Can anyone suggest what may be causing the delays and/or what I should try next?
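
    One way to confirm the referral cache is the moving part: inspect and flush it on an affected client and time the next namespace access. A sketch using dfsutil (ships with the DFS management tools; run from an elevated prompt):

        rem Show cached referrals and their remaining time-to-live:
        dfsutil /pktinfo

        rem Flush the referral cache, then immediately browse \\domain.name\namespace1:
        dfsutil /pktflush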

  • How to setup linux permissions for the WWW folder?

    - by Xeoncross
    Updated summary: the /var/www directory is owned by root:root, which means that no one can use it and it's entirely useless. Since we all want a web server that actually works (and no one should be logging in as "root"), we need to fix this. Only two entities need access:
    1. PHP/Perl/Ruby/Python all need access to the folders and files, since they create many of them (i.e. /uploads/). These scripting languages should be running under nginx or apache (or even something else, like FastCGI for PHP).
    2. The developers.
    How do they get access? I know that someone, somewhere has done this before. With however-many billions of websites out there, you would think there would be more information on this topic. I know that 777 is full read/write/execute permission for owner/group/other, so that doesn't seem to be needed, as it leaves random users full permissions. What permissions need to be used on /var/www so that:
    1. Source control like git or svn
    2. Users in a group like "websites" (or even added to "www-data")
    3. Servers like apache or lighttpd
    4. And PHP/Perl/Ruby
    can all read, create, and run files (and directories) there? If I'm correct, Ruby and PHP scripts are not "executed" directly but passed to an interpreter, so there is no need for execute permission on files in /var/www...? Therefore, it seems like the correct permission would be chmod -R 1660, which would make all files shareable by these four entities, all files non-executable by mistake, block everyone else from the directory entirely, and set the permission mode to "sticky" for all future files. Is this correct?
    Update: I just realized that files and directories might need different permissions. I was talking about files above, so I'm not sure what the directory permissions would need to be.
    Update 2: The folder structure of /var/www changes drastically, as one of the four entities above is always adding (and sometimes removing) folders and sub-folders many levels deep. They also create and remove files that the other three entities might need read/write access to. Therefore, the permissions need to do the four things above for both files and directories. Since none of them should need execute permission (see the question about Ruby/PHP above), I would assume that rw-rw-r-- would be all that is needed and completely safe, since these four entities are run by trusted personnel (see #2) and all other users on the system have only read access.
    Update 3: This is for personal development machines and private company servers. No random "web customers" like on a shared host.
    Update 4: This article by Slicehost seems to be the best at explaining what is needed to set up permissions for your www folder. However, I'm not sure what user or group apache/nginx with PHP or svn/git run as, and how to change them.
    Update 5: I have (I think) finally found a way to get this all to work (answer below). However, I don't know if this is the correct and SECURE way to do this. Therefore I have started a bounty. The person with the best method of securing and managing the www directory wins.
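
    For comparison, the setup this kind of question usually converges on is group ownership plus the setgid bit on directories, so new content inherits the group. A sketch (the www-data group and paths are illustrative; adjust for your distro and server user):

        # Hand the tree to a shared group.
        sudo chown -R root:www-data /var/www

        # Directories: rwx for owner and group, setgid so new files keep the group.
        sudo find /var/www -type d -exec chmod 2775 {} +

        # Files: read/write for owner and group, read-only for others, no execute.
        sudo find /var/www -type f -exec chmod 0664 {} +

        # Put each developer (and the app user, if different) into the group.
        sudo usermod -a -G www-data somedeveloper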

  • Exchange 2010 OWA - a few questions about using multiple mailboxes

    - by Alexey Smolik
    We have an Exchange 2010 SP2 deployment, and we need our users to be able to access multiple mailboxes in OWA. The problem is that a user (e.g. John Smith) needs to access not just somebody else's (e.g. Tom Anderson's) mailboxes, but his OWN mailboxes, e.g. in different domains: [email protected], [email protected], [email protected], etc. Of course it is preferable for the user to work with all of his mailboxes from a single window. Such mailboxes can be added as multiple Exchange accounts in Outlook, which works almost fine. But in OWA, there are problems:
    1) In the left pane, as I've learned, we can only open Inbox folders from other mailboxes. Is there no way to view all folders, like in Outlook?
    2) With Send-As permissions set, when trying to send a message from another address, the message is saved in the Sent Items folder of the mailbox that is opened in OWA, not in the mailbox the message is sent from. The same goes for the trash can. Is there a way to fix that? This problem also exists in desktop Outlook when mailboxes are added automatically via the Auto-Mapping feature, so we have to turn it off and add the accounts manually. Is there a simpler workaround?
    3) Okay, suppose we only open Inbox folders in the left pane. The problem is that the mailbox names shown there are formed from the Display Name attribute, but those names are all identical: all the mailboxes are owned by John Smith, so they are all named John Smith (so that a letter's recipient sees "John Smith" in the "from" field, no matter which mailbox it is sent from). The user knows his own name; he wants to know which mailbox he is working with. So we need a way to either: a) customize OWA to show the mailbox email address instead of the user's Display Name, or b) make Exchange use another attribute for the "from" field when sending letters.
    4) Okay, we can switch between mailboxes using "Open Other Mailbox" in the upper-right corner menu. But: a) to select a mailbox we need to enter its name (or its first letters); is there a way to show a list of links to the mailboxes the user has full access to, e.g. in the page header? b) If we start entering the first letters, we see a popup list of possible mailboxes to open, but it contains all mailboxes (apparently from the GAL), not only the ones the user has permission to open. How do we filter that popup list? c) The same naming problem as in (3): we can see the opened mailbox's email address ONLY in the page URL, which is insufficient for many users; in the left pane we see "John Smith", which is useless.
    5) Each mailbox is tied to a separate user in AD. If one person has several mailboxes, we need additional dummy AD accounts, additional OUs to store them, etc. That's not very nice; is there a standardized, optimal way to build such a structure?
    We would really appreciate any answers or additional info for any of these questions. Thank you in advance.
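
    On point 2, the Auto-Mapping behavior can be suppressed when the permission is granted, instead of adding the accounts manually. A sketch from the Exchange Management Shell (identities are placeholders; the AutoMapping parameter is available from SP1 on):

        # Grant full access without Outlook auto-mapping the extra mailbox:
        Add-MailboxPermission -Identity "SmithDomain2" -User "john.smith" -AccessRights FullAccess -AutoMapping $false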
