Search Results

Search found 59767 results on 2391 pages for 'project files'.


  • Hosting files with support for file tagging / keywords

    - by Zev Chonoles
    I have a large (approx. 25GB) collection of files I would like to host online for people to view or download. I have a spare computer I can use as a dedicated server for these files. I'm looking for a method of, or piece of software for, hosting my files where I can assign tags or keywords to the files, and people viewing my files online can search the collection via the tags. By way of approximate solutions I've found so far, I see that there is software such as Collectorz.com or Readerware for creating databases of one's books / music / movies, and these databases can be searched by tags or keywords, and the databases can be made available and searchable online; this would suit my purposes except that my files are not necessarily books, music, or movies, and I want the files themselves accessible online, not a database describing my files. A commercially-available solution like the ones above would be acceptable, but I'd prefer to have the whole setup under my control (i.e. I'd like to either implement it by hand, or use commercial software that doesn't rely on using the company's servers, paying them a continued fee, etc.). The current extent of my internet experience is designing a few Google Sites, so I know there's a fair chance I won't understand the answers I receive, but I'm always happy to have a summer project :)

    Read the article

  • Maven Integrated View for NetBeans IDE

    - by Geertjan
    Started working on an oft-heard request from Kirk Pepperdine for an integrated view for multimodule builds for Maven projects in NetBeans IDE, as explained here. I suddenly had some kind of brainwave and solved all the remaining problems I had, by delegating to the LogicalViewProvider's node, instead of the project's node, which means I inherit all the icons, actions, package nodes, and anything else that was originally defined within the original project, in this case for the open source JAnnocessor project: Above, you can see that the Maven submodules can either be edited in-line, i.e., within the parent project, or separately, by opening them in the traditional NetBeans way. Get the module here: http://plugins.netbeans.org/plugin/45180/?show=true

    Some people out there might be interested in how this is achieved. First, hide the original ModulesNodeFactory in the layer. Then create the following class, which creates what you see in the screenshot above:

        import java.util.ArrayList;
        import java.util.List;
        import javax.swing.event.ChangeListener;
        import org.netbeans.api.project.Project;
        import org.netbeans.spi.project.SubprojectProvider;
        import org.netbeans.spi.project.ui.LogicalViewProvider;
        import org.netbeans.spi.project.ui.support.NodeFactory;
        import org.netbeans.spi.project.ui.support.NodeList;
        import org.openide.nodes.FilterNode;
        import org.openide.nodes.Node;

        @NodeFactory.Registration(projectType = "org-netbeans-modules-maven", position = 400)
        public class ModulesNodeFactory2 implements NodeFactory {

            @Override
            public NodeList<?> createNodes(Project prjct) {
                return new MavenModulesNodeList(prjct);
            }

            private class MavenModulesNodeList implements NodeList<Project> {

                private final Project project;

                public MavenModulesNodeList(Project prjct) {
                    this.project = prjct;
                }

                @Override
                public List<Project> keys() {
                    return new ArrayList<Project>(
                            project.getLookup().
                            lookup(SubprojectProvider.class).getSubprojects());
                }

                @Override
                public Node node(final Project project) {
                    Node node = project.getLookup().lookup(LogicalViewProvider.class).createLogicalView();
                    return new FilterNode(node, new FilterNode.Children(node));
                }

                @Override
                public void addChangeListener(ChangeListener cl) {
                }

                @Override
                public void removeChangeListener(ChangeListener cl) {
                }

                @Override
                public void addNotify() {
                }

                @Override
                public void removeNotify() {
                }
            }
        }

    Considering that there's only about 5 actual statements above, it's pretty amazing how much can be achieved with so little code. The NetBeans APIs really are very cool. Hope you like it, Kirk!

    Read the article

  • IPS Facets and Info files

    - by mkupfer
    One of the unusual things about IPS is its "facet" feature. For example, if you're a developer using the foo library, you don't install a libfoo-dev package to get the header files. Instead, you install the libfoo package, and your facet.devel setting controls whether you get header files. I was reminded of this recently when I tried to look at some documentation for Emacs Org mode. I was surprised when Emacs's Info browser said it couldn't find the top-level Info directory. I poked around in /usr/share but couldn't find any info files.

        $ ls -l /usr/share/info
        ls: cannot access /usr/share/info: No such file or directory

    Was I missing a package?

        $ pkg list -a | egrep "info|emacs"
        editor/gnu-emacs                        23.1-0.175.0.0.0.2.537     i--
        editor/gnu-emacs/gnu-emacs-gtk          23.1-0.175.0.0.0.2.537     i--
        editor/gnu-emacs/gnu-emacs-lisp         23.1-0.175.0.0.0.2.537     ---
        editor/gnu-emacs/gnu-emacs-no-x11       23.1-0.175.0.0.0.2.537     ---
        editor/gnu-emacs/gnu-emacs-x11          23.1-0.175.0.0.0.2.537     i--
        system/data/terminfo                    0.5.11-0.175.0.0.0.2.1     i--
        system/data/terminfo/terminfo-core      0.5.11-0.175.0.0.0.2.1     i--
        text/texinfo                            4.7-0.175.0.0.0.2.537      i--
        x11/diagnostic/x11-info-clients         7.6-0.175.0.0.0.0.1215     i--
        $

    Hmm. I didn't have the gnu-emacs-lisp package. That seemed an unlikely place to stick the Info files, and pkg(1) confirmed that the info files were not there:

        $ pkg contents -r gnu-emacs-lisp | grep info
        usr/share/emacs/23.1/lisp/info-look.el.gz
        usr/share/emacs/23.1/lisp/info-xref.el.gz
        usr/share/emacs/23.1/lisp/info.el.gz
        usr/share/emacs/23.1/lisp/informat.el.gz
        usr/share/emacs/23.1/lisp/org/org-info.el.gz
        usr/share/emacs/23.1/lisp/org/org-jsinfo.el.gz
        usr/share/emacs/23.1/lisp/pcvs-info.el.gz
        usr/share/emacs/23.1/lisp/textmodes/makeinfo.el.gz
        usr/share/emacs/23.1/lisp/textmodes/texinfo.el.gz
        $

    Well, if I have what look like the right packages but don't have the right files, the next thing to check are the facets. The first check is whether there is a facet associated with the Info files:

        $ pkg contents -m gnu-emacs | grep usr/share/info
        dir facet.doc.info=true group=bin mode=0755 owner=root path=usr/share/info
        file [...] chash=[...] facet.doc.info=true group=bin mode=0444 owner=root path=usr/share/info/mh-e-1 [...]
        file [...] chash=[...] facet.doc.info=true group=bin mode=0444 owner=root path=usr/share/info/mh-e-2 [...]
        [...]

    Yes, they're associated with facet.doc.info. Now let's look at the facet settings on my desktop:

        $ pkg facet
        FACETS            VALUE
        facet.locale.en*  True
        facet.locale*     False
        facet.doc.man     True
        facet.doc*        False
        $

    Oops. I've got man pages and various English documentation files, but not the Info files. Let's fix that:

        # pkg change-facet facet.doc.info=True
                    Packages to update: 970
               Variants/Facets to change:   1
               Create boot environment:  No
        Create backup boot environment: Yes
                    Services to change:   1

        DOWNLOAD        PKGS       FILES    XFER (MB)
        Completed    970/970     181/181      9.2/9.2

        PHASE                        ACTIONS
        Install Phase                226/226

        PHASE                          ITEMS
        Image State Update Phase         2/2

        PHASE                          ITEMS
        Reading Existing Index           8/8
        Indexing Packages            970/970
        #

    Now we have the info files:

        $ ls -F /usr/share/info
        a2ps.info     dir@      flex.info    groff-2    regex.info
        aalib.info    dired-x   flex.info-1  groff-3    remember
        ...

    Read the article

  • Safely delete a TFS branch project

    - by Codesleuth
    I'm currently reorganising our TFS source control for a very large set of solutions, and I've done this successfully so far. My problem at the moment is that I need to delete a legacy "Release Branch" TFS project that was branched for the old structure and is no longer required, since I now host a release branch within the new structure. This is how the source control looks after moving everything:

        $/Source Project
            /Trunk
                /[Projects]
            /Release
                /[Projects]
        $/Release Branch Project
            /[Projects]
            /[Other legacy stuff]

    So far I've found information that says:

        tf delete /lock:checkout /recursive TestMain    (to delete a branch)
        TfsDeleteProject                                (to delete a project)

    tf delete seems to be relevant only when I need to delete a branch that is within the same project as the trunk, and TfsDeleteProject doesn't seem like it will delete the branch association from the source project (I hope I'm wrong, see below). Can someone tell me whether the above will work, and in what order I should perform the steps, to successfully delete the TFS $/Release Branch Project while also deleting the branch association (from right-click $/Source Project - Properties - Branches)?

    Read the article

  • Clean conflicting class files from Temporary ASP.NET Files

    - by Deepfreezed
    Class file conflicts in C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\ are preventing me from building the solution. Even though I try emptying out the folder, each time Visual Studio starts the build process it brings the class file back into the temp folder under the same folder name. If I restart the machine or leave it overnight, the project builds without error. Is there any way to tell Visual Studio to delete, ignore, or clean any lingering class files that could be in the temp folder? The Clean Solution option in VS doesn't work either. The class files in conflict are from the App_Code folder.

    Read the article

  • Link Maven OSGi to Maven NetBeans Platform Project

    - by mxro
    I am using NetBeans 6.9 Beta and I would like to accomplish the following:

    - Set up a project representing the main application using Maven (for instance "Maven Project" or "Maven NetBeans Application"). Ideally, the project should contain only the libraries necessary to run in Apache Felix (I would like to be able to right-click the project and select "Run in Felix"); I do not want the project to contain all the NetBeans Platform APIs.
    - I would prefer to implement the modules using OSGi, for instance "Maven OSGi Bundle" or "Maven NetBeans Module" + OSGi.

    These are the problems I have at the moment:

    - The standard Maven archetype ("Maven NetBeans Application") always seems to select all APIs, and I have not found a way to deselect APIs (in normal NetBeans Platform applications this can be accomplished by going to the project properties and deselecting the platform modules). I guess it has something to do with the NetBeans repository (http://bits.netbeans.org/maven2)? Do I have to create another repository?
    - When creating a normal "NetBeans Module" with OSGi support, the modules contain both NetBeans Module and OSGi metadata, which is nice. But the "Maven NetBeans Modules" have only NetBeans metadata, and the Maven OSGi Bundles have only OSGi metadata.
    - I figured out how to add modules to the project by using Project / New and then placing the modules in the Maven project folder. However, I do not quite know yet how I could link to modules from other locations (NetBeans uses Maven modules, which have to be in the same directory as the project?).

    Some useful links for Maven + OSGi in NetBeans (sorry about the missing protocol, but I couldn't post the message otherwise):

    - wiki.netbeans.org/STS_69_Maven_OSGI - NetBeans Maven OSGi Test Specification
    - platform.netbeans.org/tutorials/nbm-maven-quickstart.html - NetBeans Platform Quick Start Using Maven (6.9)
    - wiki.netbeans.org/MavenBestPractices - NetBeans Maven Best Practices
    - maven.apache.org/pom.html#Aggregation - Maven Documentation, Multi-Module Projects

    Read the article

  • What does "<project> is obsolete" mean?

    - by dangerisgo
    My batch build has a project, let's call it 'My.Project'. That project got up-revved to .NET 3.5 (from .NET 1.1) in its own standalone project (meaning it's not part of this batch build). Most of it is the same; some calls were upgraded and features were added or removed. I'm starting to replace all the calls the other projects make to that now-old project with calls to the new one: I first remove the old reference and then add the new one. One of my projects has only one call. When I go to make the changes, one namespace in the new project throws a 'My.Project.Namespace1 is obsolete' warning, while another call to that project throws nothing. Is there something I should be looking for? What would be causing this?

    Read the article

  • VS2008 VB project - Changing application type automatically adds references

    - by Stijn
    Visual Basic:
    1. Create a new project with the Empty Project template (Visual Basic - Windows).
    2. Go to the project properties and change the Application type, either by choosing something else or by reselecting Windows Forms Application. When reselecting, Visual Studio will automatically add references to System.Deployment, System.Drawing and System.Windows.Forms.

    C#:
    1. Create a new project with the Empty Project template (Visual C# - Windows).
    2. Go to the project properties and change the Application type to any of the choices. Visual Studio will not add references.

    Question: Is there a way to change this behaviour for Visual Basic?

    Read the article

  • Why are those modules being loaded in an ASP.NET project (not website)

    - by petergmagid
    I have an ASP.NET 3.5 Project (not website) and I don't understand why all these modules are being created and loaded. I thought that with a web project it would all compile to a single .DLL.

        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_fwtnlvuq.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_vb8hmtmg.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_v-nkuwgl.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_wn_uucrw.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_ngd_8nhu.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_8keebrhe.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_ohg9e50r.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_yhmgvhum.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_4qltywkk.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_1nml5ezc.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_cdju8bdk.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_xhugloto.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_rkqqzc0u.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_-vfyn7ik.dll', Symbols loaded.
        'WebDev.WebServer.EXE' (Managed): Loaded 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reviewstat_20\c147e006\64781866\App_Web_cthyzgij.dll', Symbols loaded.

    Read the article

  • Is it possible to exclude some files from checkin (TFS) ?

    - by Thomas Wanner
    We use configuration files within various projects under source control (TFS), where each developer has to make some adjustments in his local copy to configure his environment. The build process takes care of replacing the config files with the server configuration as part of the deployment, so it doesn't actually matter what is in the repository. However, we would still like to keep some kind of default, non-breaking version of the config files in the repository, so that people not involved in the particular project won't run into trouble because of local misconfiguration. We tried to resolve this by introducing a check-in policy that simply forbids checking in the config files. This works fine, but because we're too lazy to always uncheck those checkboxes in the Pending Changes window, the question arises: is it possible to transparently disable the check-in of particular files without keeping them out of source control (e.g. by locking their current version)?

    Read the article

  • External project reference in android

    - by bryan costanich
    Hi all, I have an Android application that uses another project as a project reference. I've gone into the application options, then Build Path, and selected the project under Project References. Everything builds and deploys fine, but when I try to run the application and instantiate a class from that project, I get a java.lang.NoClassDefFoundError. I'm no Java project expert, so I assume I've cocked something simple up. Any ideas?

    Read the article

  • What is the best practice to segment c#.net projects based on a single base project

    - by Anthony
    Honestly, I can't word my question any better without describing it. I have a base project (with all its glory: DLLs, resources, etc.) which is a CMS. I need to use this project as a base for other custom-bake projects, and the base project is to be maintained and updated across all of them. I use Subversion (CollabNet and TortoiseSVN). I have two questions:

    1 - Can I use Subversion to share the base project among other projects? What I mean here is: can I "check out" the base project into another "checked out" project and have both update and commit separately? To paint a picture, let's say I am working on a custom project and I modify the core/base project in some way (which I know will suit the others). Can I then commit those changes, and when I update the base project in the other checked-out resources, will it pull the changes? In short, I would like not to have to manually deploy updated core files to each separate project whenever I make changes.

    2 - If I create a custom file (let's say a web control or aspx page, etc.), can I have it compile separately from the base project? Another tricky one to explain. When I publish my web application it creates DLLs based on the namespaces of the projects attached to it, so I may have a number of DLLs including the "website's" namespace DLL, which could simply be Website. I want to be able to make a separate, custom control which does not compile into those DLLs, as the custom files should not rely on those DLLs to run. Is it as simple as setting a separate namespace for those files, like CustomFiles.ProjectName for example? Think of the whole idea as adding modules to the .NET project: I don't want the module's code in any of the core DLLs, but I do need the module to be able to access the core DLLs. (There is no need for the core project to access the module code, as it should be one way only in theory, though I reckon that would not be possible anyway without using JSON/SOAP or something like that; maybe I am wrong.) I want to create a pluggable environment much like that of Joomla/WordPress; since PHP generally doesn't have to be compiled first, I see this as the reason why all of this is possible/easy there. The idea is to allow pluggable themes, modules, etc. (I haven't tried simply adding .NET themes after compile/publish, but I am assuming this is possible anyway? Or does the compiler need to reference items in the files?)

    Read the article

  • How to convert Xml files to Text Files 2

    - by John
    Hi all, I have around 8000 XML files that need to be converted into text files. Each text file must contain the title, description and keywords of the XML file, without the tags and with the other elements and attributes removed. In other words, I need to create 8000 text files containing the title, description and keywords of each XML file. I need code that does this systematically. Any help would be greatly appreciated. Thanks in advance. Hey all, thank you so much for your replies. Here's a sample of what my XML looks like:

        <?xml version="1.0"?>
        <metadata>
          <identifier>43productionsNightatthegraveyard</identifier>
          <title>Night at the graveyard</title>
          <collection>opensource_movies</collection>
          <mediatype>movies</mediatype>
          <resource>movies</resource>
          <upload_application appid="ccPublisher" version="2.2.1"/>
          <uploader>[email protected]</uploader>
          <description>una noche en el cementerio (terror)</description>
          <license>http://creativecommons.org/licenses/by-nc/3.0/</license>
          <title>Night at the graveyard</title>
          <format>Video</format>
          <adder>[email protected]</adder>
          <licenseurl>http://creativecommons.org/licenses/by-nc/3.0/</licenseurl>
          <year>2007</year>
          <keywords>Night,at,the,graveyard,43,productions</keywords>
          <holder>43 productions</holder>
          <publicdate>2007-04-11 19:52:28</publicdate>
        </metadata>

    And this would be the output:

        una noche en el cementerio (terror)
        Night at the graveyard
        Night,at,the,graveyard,43,productions

    This needs to be saved with the same name but in text format. Thanks so much; any more suggestions would be much appreciated.
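    One way to approach this is sketched below in Python, using the standard xml.etree.ElementTree module and assuming every file is shaped like the sample above; the SRC_DIR and DST_DIR folder names are placeholders, and the output order (description, title, keywords) follows the example output.

        # Minimal sketch: pull <description>, <title> and <keywords> out of each
        # XML file and write them to a .txt file with the same base name.
        import os
        import xml.etree.ElementTree as ET

        SRC_DIR = "xml_in"    # placeholder folder holding the ~8000 XML files
        DST_DIR = "txt_out"   # placeholder output folder

        os.makedirs(DST_DIR, exist_ok=True)

        for name in os.listdir(SRC_DIR):
            if not name.lower().endswith(".xml"):
                continue
            root = ET.parse(os.path.join(SRC_DIR, name)).getroot()
            # findtext returns the first matching element's text, or "" if missing
            lines = [
                root.findtext("description", default=""),
                root.findtext("title", default=""),
                root.findtext("keywords", default=""),
            ]
            out_name = os.path.splitext(name)[0] + ".txt"
            with open(os.path.join(DST_DIR, out_name), "w", encoding="utf-8") as f:
                f.write("\n".join(lines) + "\n")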

    Read the article

  • to write log files in two different files

    - by Sun
    My application runs on a customized client framework, and the client framework uses log4net to write its own log files. We (our application) have to use the same log4net to write our log files to our own path (say, our customized path). Currently our log files are created, but the log entries are not written to that file; they are written to the client framework's log file instead. I searched a lot of sites, and the link http://stackoverflow.com/questions/308436/log4net-programmatcially-specify-multiple-loggers-with-multiple-file-appenders helped me configure log4net programmatically, but my log statements are still not written to my log file. The code I use is below:

        public class TraceLog
        {
            private string message = string.Empty;
            private static ILog ILogger = null;
            private static TraceLog instance = new TraceLog();

            private TraceLog()
            {
                SetLevel("Log4net.MainForm", "ALL");
                AddAppender("Log4net.MainForm", CreateFileAppender("FileAppender", "C:\\mylog.log"));
            }

            public static TraceLog Instance
            {
                get { return instance; }
            }

            public void Debug(string logMessage)
            {
                message = PrepareLog(logMessage);
                ILogger.Debug(message);
            }

            protected string PrepareLog(string logMessage)
            {
                string message = GetFileMethodLineNumberInfo();
                message += logMessage;
                return message;
            }

            protected string GetFileMethodLineNumberInfo()
            {
                StackTrace stackTrace = new StackTrace(true);
                // The position 3 is relative to the index of the specified method
                StackFrame stackFrame = stackTrace.GetFrame(3);
                return (stackFrame.GetMethod().DeclaringType.Name + "/" +
                        stackFrame.GetMethod().Name + "/" +
                        stackFrame.GetFileLineNumber() + ":");
            }

            private static void SetLevel(string loggerName, string levelName)
            {
                ILogger = LogManager.GetLogger(loggerName);
                log4net.Repository.Hierarchy.Logger l = (log4net.Repository.Hierarchy.Logger)ILogger.Logger;
                l.Level = l.Hierarchy.LevelMap[levelName];
            }

            private static void AddAppender(string loggerName, IAppender appender)
            {
                ILogger = LogManager.GetLogger(loggerName);
                log4net.Repository.Hierarchy.Logger l = (log4net.Repository.Hierarchy.Logger)ILogger.Logger;
                l.AddAppender(appender);
            }

            private static IAppender CreateFileAppender(string name, string fileName)
            {
                FileAppender appender = new FileAppender();
                appender.Name = name;
                appender.File = fileName;
                appender.AppendToFile = true;
                //PatternLayout layout = new PatternLayout();
                //layout.ConversionPattern = "%d [%t] %-5p %c [%x] - %m%n";
                //layout.ActivateOptions();
                //appender.Layout = layout;
                appender.ActivateOptions();
                return appender;
            }
        }

    Can anyone please help me figure out how to solve this?

    Read the article

  • Extract files from zip folder and store these files in blobstore

    - by Eng_Engineer
    I want to upload a zip file from a file input in a form, then extract the contents of the uploaded zip and store those files in the blobstore so they can be downloaded later, after putting them in one folder. The problem is that I can't deal with the zip file directly (to read it). I tried this:

        form = cgi.FieldStorage()
        file_upload = form['file']
        zip1 = file_upload.filename
        zipstream = StringIO.StringIO(zip1.read())

    But the problem remains that I can't read the zip. I also tried to read the zip file directly like this:

        z1 = zipfile.ZipFile(zip1, "r")

    But that gave an error as well. Please, can anyone help me? Thanks in advance.
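    For reference, a likely culprit in the snippet above is that file_upload.filename is only the file's name (a string), not its contents, so calling read() on it or handing it to ZipFile fails. A minimal sketch of reading the upload into a ZipFile, keeping the Python 2 / cgi style of the snippet and leaving the blobstore write-out as a placeholder, might look like this:

        import cgi
        import zipfile
        import StringIO  # Python 2, as in the snippet above

        form = cgi.FieldStorage()
        upload = form['file']

        # upload.filename is only the name; the bytes live in upload.file / upload.value
        zip_bytes = upload.file.read()            # or: upload.value
        zip_stream = StringIO.StringIO(zip_bytes)

        archive = zipfile.ZipFile(zip_stream, 'r')
        for member in archive.namelist():
            data = archive.read(member)
            # write `data` to the blobstore here; that part depends on the
            # App Engine APIs in use and is intentionally not shown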

    Read the article

  • How to delete files with a Python script from a FTP server which are older than 7 days?

    - by Tom
    Hello, I would like to write a Python script that lets me delete files from an FTP server after they have reached a certain age. I prepared the script below, but it throws the error message: WindowsError: [Error 3] The system cannot find the path specified: '/test123/*.*'. Does someone have an idea how to resolve this issue? Thank you in advance!

        import os, time
        from ftplib import FTP

        ftp = FTP('127.0.0.1')
        print "Automated FTP Maintainance"
        print 'Logging in.'
        ftp.login('admin', 'admin')

        # This is the directory that we want to go to
        directory = 'test123'
        print 'Changing to:' + directory
        ftp.cwd(directory)
        files = ftp.retrlines('LIST')
        print 'List of Files:' + files
        # ftp.remove('LIST')

        #-------------------------------------------
        now = time.time()
        for f in os.listdir(directory):
            if os.stat(f).st_mtime < now - 7 * 86400:
                if os.directory.isfile(f):
                    os.remove(os.directory.join(directory, f))
        #except:
        #    exit ("Cannot delete files")
        #-------------------------------------------

        print 'Closing FTP connection'
        ftp.close()
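    As a point of reference, the error suggests the script is mixing local filesystem calls (os.listdir, os.stat, os.remove) with a directory that exists only on the FTP server. A sketch of doing the age check purely over FTP is shown below; it assumes the server supports the MDTM command, reuses the placeholder host, credentials and directory from the script above, and omits error handling (directories in the listing, servers without MDTM):

        import time
        from ftplib import FTP

        MAX_AGE_DAYS = 7
        cutoff = time.time() - MAX_AGE_DAYS * 86400

        ftp = FTP('127.0.0.1')          # placeholder host
        ftp.login('admin', 'admin')     # placeholder credentials
        ftp.cwd('test123')

        for name in ftp.nlst():
            # MDTM returns e.g. '213 20240131120000' (server time, usually UTC);
            # it will fail on directories or servers that do not support it
            resp = ftp.sendcmd('MDTM ' + name)
            timestamp = time.mktime(time.strptime(resp.split()[-1], '%Y%m%d%H%M%S'))
            if timestamp < cutoff:
                print 'Deleting', name
                ftp.delete(name)

        ftp.quit()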

    Read the article

  • How are files (especially audio files) organized internally?

    - by mystify
    I'm trying to grok this: Apple talks about "packets" in audio files, and there is a fancy function called AudioFileReadPackets which takes a lot of arguments. One of them specifies the "start packet", and another the number of packets you want to read. So I imagine an audio file to look like this internally: it's made up of a lot of packets. If the file has a variable bit rate format, then every packet may have a different size; if it has a constant bit rate format, then every packet is the same size. So an audio file is like a truck full of boxes, and every box contains some interesting stuff. Is that correct? Does it apply to any kind of file? Is this how files actually look?
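    To make the constant-versus-variable bit rate distinction concrete, here is a small illustrative sketch (plain Python arithmetic, not Core Audio): with constant-size packets, the byte offset of packet N is simple multiplication, while with variable-size packets you need the sizes of all preceding packets, which is essentially what a packet table provides. The numbers used are made up.

        # Illustrative only: how a reader could locate packet N in a file.
        def cbr_packet_offset(packet_index, bytes_per_packet, data_start=0):
            # Constant bit rate: every packet has the same size,
            # so the offset is simple arithmetic.
            return data_start + packet_index * bytes_per_packet

        def vbr_packet_offset(packet_index, packet_sizes, data_start=0):
            # Variable bit rate: packets differ in size, so you need the sizes
            # of all preceding packets (a packet table) to find the offset.
            return data_start + sum(packet_sizes[:packet_index])

        print(cbr_packet_offset(10, 417))                   # fixed-size packets
        print(vbr_packet_offset(3, [210, 198, 305, 250]))   # made-up VBR sizes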

    Read the article

  • Analyze VS2010 C# projects and report files on disk not part of the projects?

    - by Lasse V. Karlsen
    I discovered earlier tonight that files and folders I have removed from my C# projects are apparently still on disk, even though my Visual Studio Mercurial plugin seems to do a good job of deleting them when I delete them in Visual Studio. It must have hiccuped when it came to these files. So I wondered... Does anyone have a script or similar, or know of something, that will look at my .csproj files and report extra files and folders on my disk that aren't part of the project files? I just want to clean up my repository contents.
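    A rough sketch of one way to do this in Python is below; it simply diffs the Include attributes found in each .csproj against the files on disk. The excluded folder names are assumptions, and MSBuild subtleties (wildcards, linked files, None items) are ignored, so treat the output as a starting point rather than the truth.

        # Sketch: list files under a root folder that no .csproj there references.
        import os
        import sys
        import xml.etree.ElementTree as ET

        MSBUILD_NS = '{http://schemas.microsoft.com/developer/msbuild/2003}'
        SKIP_DIRS = ('bin', 'obj', '.hg', '.git')  # assumed output/VCS folders

        def referenced_files(csproj_path):
            base = os.path.dirname(csproj_path)
            refs = set()
            for item in ET.parse(csproj_path).getroot().iter():
                inc = item.get('Include')
                if inc and item.tag.startswith(MSBUILD_NS):
                    path = os.path.join(base, inc.replace('\\', os.sep))
                    refs.add(os.path.normcase(os.path.normpath(path)))
            return refs

        def main(root):
            refs = set()
            for dirpath, dirnames, filenames in os.walk(root):
                dirnames[:] = [d for d in dirnames if d.lower() not in SKIP_DIRS]
                for name in filenames:
                    if name.endswith('.csproj'):
                        refs |= referenced_files(os.path.join(dirpath, name))
            for dirpath, dirnames, filenames in os.walk(root):
                dirnames[:] = [d for d in dirnames if d.lower() not in SKIP_DIRS]
                for name in filenames:
                    full = os.path.normcase(os.path.normpath(os.path.join(dirpath, name)))
                    if not name.endswith(('.csproj', '.sln')) and full not in refs:
                        print(full)  # candidate for removal from the repository

        if __name__ == '__main__':
            main(sys.argv[1] if len(sys.argv) > 1 else '.')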

    Read the article

  • Fragmented Log files could be slowing down your database

    - by Fatherjack
    Something that is sometimes forgotten by a lot of DBAs is the fact that database log files get fragmented in the same way that you get fragmentation in a data file. The cause is very different but the effect is the same – too much effort reading and writing data. Data files get fragmented as data is changed through normal system activity; INSERTs, UPDATEs and DELETEs cause fragmentation, and most experienced DBAs are monitoring their indexes for fragmentation and dealing with it accordingly. However, you don’t hear about so many working on their log files.

    How can a log file get fragmented? I’m glad you asked. When you create a database there are at least two files created on the disk storage: an mdf for the data and an ldf for the log file (you can also have ndf files for extra data storage, but that’s off topic for now). It is wholly possible to have more than one log file, but in most cases there is little point in creating more than one, as the log file is written to in a ‘wrap-around’ method (more on that later). When a log file is created at the time that a database is created, the file is actually subdivided into a number of virtual log files (VLFs). The number and size of these VLFs depends on the size chosen for the log file. VLFs are also created in the space added to a log file when a log file growth event takes place. Do you have your log files set to auto grow? Then you have potentially been introducing many VLFs into your log file. Let’s see how many VLFs we have in a brand new database.

        USE master
        GO
        CREATE DATABASE VLF_Test ON
        ( NAME = VLF_Test,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test.mdf',
          SIZE = 100,
          MAXSIZE = 500,
          FILEGROWTH = 50 )
        LOG ON
        ( NAME = VLF_Test_Log,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test_log.ldf',
          SIZE = 5MB,
          MAXSIZE = 250MB,
          FILEGROWTH = 5MB );
        go
        USE VLF_Test
        go
        DBCC LOGINFO;

    The results of this are, firstly, that a new database is created with the specified file sizes, and then the DBCC LOGINFO results are returned to the script editor. The DBCC LOGINFO results have plenty of interesting information in them, but let's first note there are 4 rows of information; this relates to the fact that 4 VLFs have been created in the log file. The values in the FileSize column are the sizes of each VLF in bytes; you will see that the last one to be created is slightly larger than the others. So, a 5MB log file has 4 VLFs of roughly 1.25 MB. Let's alter the CREATE DATABASE script to create a log file that’s a bit bigger and see what happens. Alter the code above so that the log file details are replaced by:

        LOG ON
        ( NAME = VLF_Test_Log,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test_log.ldf',
          SIZE = 1GB,
          MAXSIZE = 25GB,
          FILEGROWTH = 1GB );

    With a bigger log file specified we get more VLFs. What if we make it bigger again?

        LOG ON
        ( NAME = VLF_Test_Log,
          FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL10.ROCK_2008\MSSQL\DATA\VLF_Test_log.ldf',
          SIZE = 5GB,
          MAXSIZE = 250GB,
          FILEGROWTH = 5GB );

    This time we see more VLFs are created within our log file. We now have our 5GB log file comprised of 16 files of 320MB each. In fact these sizes fall into all the ranges that control the VLF creation criteria – what a coincidence! The rules that are followed when a log file is created or has its size increased are pretty basic:

    - If the file growth is lower than 64MB then 4 VLFs are created
    - If the growth is between 64MB and 1GB then 8 VLFs are created
    - If the growth is greater than 1GB then 16 VLFs are created.

    Now the potential for chaos comes if the default values and settings for log file growth are used. By default a database log file gets a 1MB log file with unlimited growth in steps of 10%. The database we just created is 6 MB; let’s add some data and see what happens.

        USE vlf_test
        go
        -- we need somewhere to put the data so, a table is in order
        IF OBJECT_ID('A_Table') IS NOT NULL
            DROP TABLE A_Table
        go
        CREATE TABLE A_Table
        (
            Col_A int IDENTITY,
            Col_B CHAR(8000)
        )
        GO
        -- Let's check the state of the log file
        -- 4 VLFs found
        EXECUTE ('DBCC LOGINFO');
        go
        -- We can go ahead and insert some data and then check the state of the log file again
        INSERT A_Table (col_b)
        SELECT TOP 500 REPLICATE('a',2000)
        FROM sys.columns AS sc, sys.columns AS sc2
        GO
        -- insert 500 rows and we get 22 VLFs
        EXECUTE ('DBCC LOGINFO');
        go
        -- Let's insert more rows
        INSERT A_Table (col_b)
        SELECT TOP 2000 REPLICATE('a',2000)
        FROM sys.columns AS sc, sys.columns AS sc2
        GO 10
        -- insert 2000 rows, in 10 batches and we suddenly have 107 VLFs
        EXECUTE ('DBCC LOGINFO');

    Well, that escalated quickly! Our log file is split, internally, into 107 fragments after a few thousand inserts. The same happens with any logged transactions; I just chose to illustrate this with INSERTs. Having too many VLFs can cause performance degradation at times of database start up, log backup and log restore operations, so it’s well worth keeping a check on this property.

    How do we prevent excessive VLF creation? Creating the database with larger files and also with larger growth steps, and actively choosing to grow your databases rather than leaving it to the Auto Grow event, can make sure that the growths are made with a size that is optimal.

    How do we resolve a situation of a database with too many VLFs? This process needs to be done when the database is under little or no stress so that you don’t affect system users. The steps are:

    - Back up the log: BACKUP LOG YourDBName TO YourBackupDestinationOfChoice
    - Shrink the log file to its smallest possible size: DBCC SHRINKFILE(FileNameOfTLogHere, TRUNCATEONLY) *
    - Re-size the log file to the size you want it to be, taking into account your expected needs for the coming months or year: ALTER DATABASE YourDBName MODIFY FILE ( NAME = FileNameOfTLogHere, SIZE = TheSizeYouWantItToBeIn_MB) *

    * – If you don’t know the file name of your log file, then run sp_helpfile while you are connected to the database that you want to work on and you will get the details you need. The resize step can take quite a while.

    This is already detailed far better than I can explain it by Kimberly Tripp in her blog 8-Steps-to-better-Transaction-Log-throughput.aspx. The result of this will be a log file with a VLF count according to the bullet list above.

    Knowing when VLFs are being created
    By complete coincidence, while I have been writing this blog (it’s been quite some time from its inception to going live) Jonathan Kehayias from SQLSkills.com has written a great article on how to track database file growth using Event Notifications and Service Broker. I strongly recommend taking a look at it, as this is going to catch any sneaky auto grows that take place and let you know about them right away.

    Hassle free monitoring of VLFs
    If you are lucky or wise enough to be using SQL Monitor or another monitoring tool that lets you write your own custom metrics, then you can keep an eye on this very easily. There is a custom metric for VLFs (written by Stuart Ainsworth) already on the site, and there are some others there that are very useful, so take a moment or two to look around while you are there.

    Resources
    - MSDN – http://msdn.microsoft.com/en-us/library/ms179355(v=sql.105).aspx
    - Kimberly Tripp from SQLSkills.com – http://www.sqlskills.com/BLOGS/KIMBERLY/post/8-Steps-to-better-Transaction-Log-throughput.aspx
    - Thomas LaRock at Simple-Talk.com – http://www.simple-talk.com/sql/database-administration/monitoring-sql-server-virtual-log-file-fragmentation/

    Disclosure
    I am a Friend of Red Gate. This means that I am more than likely to say good things about Red Gate DBA and Developer tools. No matter how awesome I make them sound, take the time to compare them with other products before you contact the Red Gate sales team to make your order.

    Read the article

  • Exclusive Webcast Series Explains How Project Success Drives Business Success

    - by Melissa Centurio Lopes
    In the wake of the global financial crisis, organizations throughout the world are redoubling their efforts to enhance financial discipline, achieve operational excellence, and mitigate risk. How can they address all these areas with one comprehensive strategy? With enterprise project portfolio management solutions that provide greater transparency and visibility across all projects and portfolios, says Guy Barlow, Oracle director of industry strategy. In the following interview and in an exclusive, three-part webcast series, Barlow examines today’s new management realities and explains how organizations can succeed in this environment.

    Q: Financial discipline has always been important, what’s different today?
    A: A number of organizations are showing that by fiscally aligning projects with the business goals of their organizations, they can shave off hundreds of thousands if not millions of dollars in inefficiency and waste. For example, one Oracle customer, the Columbus Regional Airport Authority, reduced its unbudgeted costs from US$24.4 million to US$3.5 million, for an 88 percent improvement.

    Q: How do organizations achieve results like this?
    A: First, they need to have the vision to see project management as part of a broad and critical element in their overall enterprise strategy. That means using a single solution, such as Oracle‘s Primavera, to manage multiple projects across multiple functions within a company. So someone in corporate mergers and acquisitions as well as a capital projects team can standardize on the same technology. By doing so they all gain greater efficiency in planning and execution—because the technology can be configured for their specific roles and needs—and the IT organization really benefits from lower maintenance. Second, enterprises must give executive leaders—CFOs, COOs, and CEOs—visibility across the entire business to easily see what projects are on track and which ones are falling behind. In fact, once executives see the power of enterprise project portfolio management, uptake is very quick across the organization.

    Read the full interview here.

    Read the article

  • When building a web application project, TFS 2008 builds two separate projects in _PublishedFolder.

    - by Steve Johnson
    I am trying to perform build automation on one of my web application projects built using VS 2008. The _PublishedWebsites folder contains two folders: Web and Deploy. I want TFS 2008 to generate only the Deploy folder and not the Web folder. Here is my TFSBuild.proj file:

        <Project ToolsVersion="3.5" DefaultTargets="Compile" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
          <Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets" />
          <Import Project="$(MSBuildExtensionsPath)\Microsoft\WebDeployment\v9.0\Microsoft.WebDeployment.targets" />
          <ItemGroup>
            <SolutionToBuild Include="$(BuildProjectFolderPath)/../../Development/Main/MySoftware.sln">
              <Targets></Targets>
              <Properties></Properties>
            </SolutionToBuild>
          </ItemGroup>
          <ItemGroup>
            <ConfigurationToBuild Include="Release|AnyCPU">
              <FlavorToBuild>Release</FlavorToBuild>
              <PlatformToBuild>Any CPU</PlatformToBuild>
            </ConfigurationToBuild>
          </ItemGroup>
          <!--<ItemGroup>
            <SolutionToBuild Include="$(BuildProjectFolderPath)/../../Development/Main/MySoftware.sln">
              <Targets></Targets>
              <Properties></Properties>
            </SolutionToBuild>
          </ItemGroup>
          <ItemGroup>
            <ConfigurationToBuild Include="Release|x64">
              <FlavorToBuild>Release</FlavorToBuild>
              <PlatformToBuild>x64</PlatformToBuild>
            </ConfigurationToBuild>
          </ItemGroup>-->
          <ItemGroup>
            <AdditionalReferencePath Include="C:\3PR" />
          </ItemGroup>
          <Target Name="GetCopyToOutputDirectoryItems"
                  Outputs="@(AllItemsFullPathWithTargetPath)"
                  DependsOnTargets="AssignTargetPaths;_SplitProjectReferencesByFileExistence">
            <!-- Get items from child projects first. -->
            <MSBuild Projects="@(_MSBuildProjectReferenceExistent)"
                     Targets="GetCopyToOutputDirectoryItems"
                     Properties="%(_MSBuildProjectReferenceExistent.SetConfiguration); %(_MSBuildProjectReferenceExistent.SetPlatform)"
                     Condition="'@(_MSBuildProjectReferenceExistent)'!=''">
              <Output TaskParameter="TargetOutputs" ItemName="_AllChildProjectItemsWithTargetPathNotFiltered"/>
            </MSBuild>
            <!-- Remove duplicates. -->
            <RemoveDuplicates Inputs="@(_AllChildProjectItemsWithTargetPathNotFiltered)">
              <Output TaskParameter="Filtered" ItemName="_AllChildProjectItemsWithTargetPath"/>
            </RemoveDuplicates>
            <!-- Target outputs must be full paths because they will be consumed by a different project. -->
            <CreateItem Include="@(_AllChildProjectItemsWithTargetPath->'%(FullPath)')"
                        Exclude="$(BuildProjectFolderPath)/../../Development/Main/Web/Bin*.pdb; *.refresh; *.vshost.exe; *.manifest; *.compiled; $(BuildProjectFolderPath)/../../Development/Main/Web/Auth/MySoftware.dll; $(BuildProjectFolderPath)/../../Development/Main/Web/BinApp_Web_*.dll;"
                        Condition="'%(_AllChildProjectItemsWithTargetPath.CopyToOutputDirectory)'=='Always' or '%(_AllChildProjectItemsWithTargetPath.CopyToOutputDirectory)'=='PreserveNewest'">
              <Output TaskParameter="Include" ItemName="AllItemsFullPathWithTargetPath"/>
              <Output TaskParameter="Include" ItemName="_SourceItemsToCopyToOutputDirectoryAlways" Condition="'%(_AllChildProjectItemsWithTargetPath.CopyToOutputDirectory)'=='Always'"/>
              <Output TaskParameter="Include" ItemName="_SourceItemsToCopyToOutputDirectory" Condition="'%(_AllChildProjectItemsWithTargetPath.CopyToOutputDirectory)'=='PreserveNewest'"/>
            </CreateItem>
          </Target>
          <!-- To modify your build process, add your task inside one of the targets below and uncomment it.
               Other similar extension points exist, see Microsoft.WebDeployment.targets.
          <Target Name="BeforeBuild"> </Target>
          <Target Name="BeforeMerge"> </Target>
          <Target Name="AfterMerge"> </Target>
          <Target Name="AfterBuild"> </Target>
          -->
        </Project>

    I want to build everything that the built-in Deploy project does for me, but I don't want the generated Web project, as it contains App_Web_xxxx.dll assemblies instead of a single compiled assembly. How can I do this?

    Read the article

  • How to find hidden/cloak files in Windows 2003?

    - by homemdelata
    Here is the point: I set Windows to display all hidden files and protected operating system files, but even after that my antivirus (Kaspersky) keeps flagging a ".dll" file in "c:\windows\system32", saying it's a riskware 'Hidden.Object'. Every time I try to find this file, it's not there. So I asked one of the developers to create a service that checks the folder every 5 seconds and, if it finds the file, copies it to another place. If it copies it to another place with the same name and extension, I still can't find the file in the other folder, but Kaspersky now flags both. If I keep the same name but use a different extension, like ".temp123", I still can't find the file. Lastly, I created an empty text file and renamed it to the same name as the other one; that file just disappeared too. After all this research it's clear that every file with this same name on this specific server gets cloaked, no matter the file extension. I created a file with this same name on another server and nothing happens; the file is still there without a problem. How can I find this kind of file? How can I "uncloak" it? How can I know what this file is doing?

    Read the article

  • How many files in a directory is too many?

    - by Kip
    Does it matter how many files I keep in a single directory? If so, how many files in a directory is too many, and what are the impacts of having too many files? (This is on a Linux server.) Background: I have a photo album website, and every image uploaded is renamed to an 8-hex-digit id (say, a58f375c.jpg). This is to avoid filename conflicts (if lots of "IMG0001.JPG" files are uploaded, for example). The original filename and any useful metadata are stored in a database. Right now, I have somewhere around 1500 files in the images directory. This makes listing the files in the directory (through an FTP or SSH client) take a few seconds. But I can't see that it has any effect other than that. In particular, there doesn't seem to be any impact on how quickly an image file is served to the user. I've thought about reducing the number of images per directory by making 16 subdirectories: 0-9 and a-f. Then I'd move the images into the subdirectories based on the first hex digit of the filename. But I'm not sure that there's any reason to do so except for the occasional listing of the directory through FTP/SSH.
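    As a concrete illustration of the 16-subdirectory idea mentioned above, here is a small sketch in Python of the one-off migration and of how a sharded path would be computed; the images directory name and the .jpg extension are assumptions taken from the example filenames.

        import os
        import shutil

        IMAGES_DIR = "images"  # assumed existing flat directory of a58f375c.jpg-style files

        def shard_path(filename):
            # Bucket by the first hex digit of the 8-hex-digit id,
            # e.g. a58f375c.jpg -> images/a/a58f375c.jpg
            return os.path.join(IMAGES_DIR, filename[0], filename)

        # One-off migration of the existing flat layout into 16 subdirectories
        for name in os.listdir(IMAGES_DIR):
            src = os.path.join(IMAGES_DIR, name)
            if os.path.isfile(src):
                dst = shard_path(name)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)

        print(shard_path("a58f375c.jpg"))  # images/a/a58f375c.jpg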

    Read the article

  • JavaScript Intellisense with Telerik in ASP.NET Master Page Project with VS 2010

    - by Otto Neff
    Today I was looking for a solution to finally get the JScript/JavaScript/jQuery IntelliSense feature working with my ASP.NET WebForms project. I found some good articles:

    - JScript IntelliSense Overview
    - JScript IntelliSense: A Reference for the "Reference" Tag
    - Enabling JavaScript intellisense in VS.NET 2010 to work with SharePoint 2010
    - Rich IntelliSense for jQuery

    BUT, all of the suggested solutions did not work right with my master-page-based Visual Studio 2010 solution. Working only with physical JavaScript files (Telerik includes certain JavaScript files like jQuery as a resource), or always configuring a new ASP.NET ScriptManager / RadScriptManager on every page derived from the master page, wasn't exactly what I was looking for. So I came up with the following simple solution to trick VS2010 and still have the project running with multiple runat="server" ScriptManagers.

    In short: a new ASP.NET control derived from ScriptManager, with empty overridden OnInit() and friends, to use as an empty wrapper for VS2010.

    In detail:

    New RadScriptManager class:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using Telerik.Web.UI;

        namespace IntellisenseJavascript.Controls
        {
            public class IntelliJS : RadScriptManager
            {
                protected override void OnInit(EventArgs e)
                {
                }

                protected override void OnPreRender(EventArgs e)
                {
                }

                protected override void OnLoad(EventArgs e)
                {
                }

                protected override void Render(System.Web.UI.HtmlTextWriter writer)
                {
                }

                public override void RenderControl(System.Web.UI.HtmlTextWriter writer)
                {
                }
            }
        }

    web.config:

        <configuration>
          ...
          <system.web>
            ...
            <pages>
              <controls>
                <add tagPrefix="telerik" namespace="Telerik.Web.UI" assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4"/>
                <add tagPrefix="VSFix" namespace="IntellisenseJavascript.Controls" assembly="IntellisenseJavascript"/>
              </controls>
            </pages>
            ...

    Master Page:

        <%@ Master Language="C#" AutoEventWireup="true" CodeBehind="Site.master.cs" Inherits="IntellisenseJavascript.Site" %>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html>
        <head id="head" runat="server">
            <title></title>
            <telerik:RadStyleSheetManager ID="radStyleSheetManager" runat="server" />
        </head>
        <body>
            <form id="form" runat="server">
                <telerik:RadScriptManager ID="radScriptManager" runat="server">
                    <Scripts>
                        <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.Core.js" />
                        <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQuery.js" />
                        <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQueryInclude.js" />
                    </Scripts>
                </telerik:RadScriptManager>
                <telerik:RadAjaxManager ID="radAjaxManager" runat="server">
                </telerik:RadAjaxManager>
                <div>
                    #MASTER CONTENT#
                    <asp:ContentPlaceHolder ID="contentPlaceHolder" runat="server">
                    </asp:ContentPlaceHolder>
                </div>
            </form>
            <script type="text/javascript">
                $(function () {
                    // Masterpage ready
                    $('body').css('margin', '50px');
                });
            </script>
        </body>
        </html>

    ASPX Page:

        <%@ Page Title="" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="IntellisenseJavascript.Default" %>
        <asp:Content ID="Content1" ContentPlaceHolderID="contentPlaceHolder" runat="server">
            <VSFix:IntelliJS runat="server" ID="intelliJS">
                <Scripts>
                    <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.Core.js" />
                    <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQuery.js" />
                    <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQueryInclude.js" />
                </Scripts>
            </VSFix:IntelliJS>
            <div style="border: 5px solid #FF9900;">
                #PAGE CONTENT#
            </div>
            <script type="text/javascript">
                $(function () {
                    // Page ready
                    $('body').css('border', '5px solid #888');
                });
            </script>
        </asp:Content>

    The Result:
    I know this is not the way it's meant to be... but now at least you can have a main ScriptManager for all common scripts and settings, inject page-specific JavaScript in the Page Load event in normal ASPX files, and have JavaScript IntelliSense for defined scripts from JS files or assembly resources in your Content. Maybe vNext will fix this.

    Read the article
