Search Results

Search found 74849 results on 2994 pages for 'file folder'.


  • Break a hard link of a file in use

    - by Stebi
    I used hard links to merge duplicated files on my SSD (space is still precious) and now have a weird problem. Common files like msvcr110.dll got hard linked. Now I want to delete a program which has this file in its installation directory, but I cannot, because this file (in another location) is used by a currently running application (I don't know which) and Windows doesn't allow me to delete a file that is in use. I can rename the file, but it still points to the same data, so it is not possible to delete it. Is there any way to break the hard link of a file which is currently in use? I currently use a trash folder that I move those files to, so I can delete the directory structure of the program being removed. But I'd like to get rid of this leftover (although it doesn't take much space, as it's a hard link).
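
    A hedged aside for readers hitting the same wall (it is not part of the question itself): on Windows 7 and later, fsutil can enumerate every directory entry that points at the same data, which at least shows what a given name is linked to. It usually needs an elevated prompt, and the path below is only an example:

        fsutil hardlink list C:\Windows\System32\msvcr110.dll

    Deleting any one of the listed names never touches the other names or the shared data, so the usual approach is to find and stop whichever process holds the file open, delete the unwanted name, and restart the process.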

    Read the article

  • Panic Transmit file upload

    - by 1ndivisible
    I've ditched Coda and bought Transmit. I'm a little confused by the file uploading. I have exactly the same folder structure remotely and locally, but if I right-click a file and choose Upload "SomeFileName.html", the file is always uploaded into the root of the remote site, even if the file is in a folder. If I choose to upload a file at assets/images/some_image.png, I would expect it to be uploaded to the same folder on the remote server, not the root. Coda dealt with this perfectly and also told me which files had been modified and needed uploading. Transmit doesn't seem to do either of these things. So my questions are: How can I upload a file to the same path on the remote server without having to drag and drop? Is there any way to have Transmit mark edited files, or upload only edited files? [There is no tag for Transmit, so if someone with more rep could make and add one, that would be grand.]

    Read the article

  • DNS Zone file and virtual host question

    - by Jake
    Hi all, I'm trying to set up a virtual host for redmine.SITENAME.com. I've edited the httpd.conf file and now I'm trying to edit my DNS settings, but I'm not sure exactly what to do. Here's a snippet of what's already in the named.conf file (the file was made by someone else, who is unreachable):

        zone "SITENAME.com" {
            type master;
            file "SITENAME.com";
            allow-transfer {
                ip.address.here.00;
                common-allow-transfer;
            };
        };

    I figure if I want to get redmine.SITENAME.com working, I need to copy that entry and just replace SITENAME.com with redmine.SITENAME.com, but will that work? I was under the impression I needed a .db file, but I don't see any reference to one in the current named.conf file. Any advice would be great, and if you need more info to answer the question, don't hesitate to ask.
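
    A hedged pointer for anyone in the same spot, assuming BIND: a subdomain normally doesn't need its own zone block; it is just another record inside the existing zone's data file. The file "SITENAME.com" line above names that data file (a .db suffix is only a naming convention), so a single added line like the one below, plus bumping the zone's serial number and reloading (rndc reload), would typically be enough. The record shown is an assumption, not taken from the question:

        ; inside the SITENAME.com zone data file
        redmine    IN    CNAME    SITENAME.com.

    An A record pointing straight at the web server's address works equally well; the Apache virtual host then decides what gets served for that name.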

    Read the article

  • Moving folders takes a long time in Windows 7

    - by acidzombie24
    What can I do to fix this? Maybe drop permission properties? Maybe not. I have a large folder with 100k files. I moved it into my archive folder and it's taking forever to move. Why is that? I know on XP it takes <1 sec, but not on Windows 7. I am sure it's a permission thing; is there a way I can disable it and make it faster?

    Read the article

  • Identify "Composite Document File"

    - by Steven
    In a folder containing several PowerPoint presentations and spreadsheets, I discovered the following file:

        Name: ppt115.tmp
        Size: 160 MB
        Meta: No EXIF or other metadata
        Type: (as identified by the Cygwin/Linux program 'file') Composite Document File V2 Document, No summary info

    Notes: The filename does not correspond to other files in the directory. Neither MS PowerPoint nor Excel can open the file; MS Word will only attempt to recover text. Please help me identify this file. Is it just a temporary file that I can safely remove?

    Read the article

  • Problem opening password encrypted .docx file on Word 2003

    - by molecule
    Hi all, I am having a problem opening a .docx file in my Word 2003. I have installed the Compatibility Pack for 2007, but when I try to open this particular file, I receive the error: "Word experienced an error trying to open the file. Try these suggestions. 1. Check the file permissions for the document. 2. Make sure there is sufficient free memory and disk space. 3. Open the file with the Text Recovery converter." I do not think it is any of those causes, as I am able to open the file on a different PC, also running Word 2003. I also do not have any issues opening non-password-encrypted .docx files. Has anyone experienced the same issue? Most posts on the internet relate to "open and repair", but as mentioned, I am able to open this file on another PC without any problems. Any advice is greatly appreciated. Thanks, George

    Read the article

  • How to troubleshoot whether a zip file is valid or too big to be unzipped?

    - by mireille raad
    Hello, I am trying to unzip a file with a size of 2GB and am getting the following error:

        unzip CLTE_C_08.zip
        Archive: CLTE_C_08.zip
        End-of-central-directory signature not found. Either this file is not
        a zipfile, or it constitutes one disk of a multi-part archive. In the
        latter case the central directory and zipfile comment will be found on
        the last disk(s) of this archive.
        unzip: cannot find zipfile directory in one of CLTE_C_08.zip or
        CLTE_C_08.zip.zip, and cannot find CLTE_C_08.zip.ZIP, period.

    After some googling, some people say that this error appears because the file is too big, others say the file is corrupt, and others say that it could be a non-unix archive. So my question: how do I find out whether the file is a valid archive on my CentOS box, and what is the command/trick to uncompress big files (if any)? Thanks in advance :)
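
    A hedged aside for readers hitting the same message: unzip -t CLTE_C_08.zip asks unzip itself to test the archive, but older unzip builds (common on CentOS of that era) lack Zip64 support and print exactly this end-of-central-directory error on archives around the 2GB mark even when they are intact. One way to cross-check without relying on unzip is Python's zipfile module, which understands Zip64 on reasonably recent Pythons; a minimal sketch, using the file name from the question:

        import zipfile

        path = "CLTE_C_08.zip"
        if not zipfile.is_zipfile(path):
            # no end-of-central-directory record found: truncated, corrupt,
            # or not a zip archive at all
            print("not recognised as a zip archive")
        else:
            with zipfile.ZipFile(path) as zf:
                bad = zf.testzip()  # CRC-checks every member; returns first bad name, or None
                print("first corrupt member: %s" % bad if bad else "archive looks OK")

    If Python reads it fine, the archive is valid and the fix is simply a Zip64-capable extractor (a newer unzip build, or another zip tool that handles large archives).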

    Read the article

  • Force Windows 8 to search indexed files

    - by Hrvoje
    When using file search in Windows 8 (Win+F) I don't get the expected results. For example, I installed VLC; it's in the Program Files (x86) folder, and that folder is selected for indexing. Searching for files (Win+F) gives 0 results. If I pin that exe to Start, then it's found, but I don't want to do that; that's not the point. Where does it search for files? Is there any way to specify search locations? It doesn't use the Indexing Options settings, or at least it seems that way. Also, searching from an Explorer window is somewhat slow: I tried entering VLC.EXE in the search box (when in the c:\ root), and it takes some time to give correct results. It works, but it looks like it doesn't use the index and instead scans all files/folders, which is slow.

    Read the article

  • Unable to transfer files from handycam to PC

    - by user143989
    I am using a Windows 7 PC and a Sony DCR-SR88 handycam, and I need to transfer all my videos from the handycam to my PC. When I connect it to the PC through USB, the PC detects the USB drive in the handycam and shows the used memory. But when I open the folder, it shows "folder is empty". How can I copy the files? I have tried the following: changed the USB cable, changed the USB port. I can play the videos on the handycam, but the files are not visible on the PC when connected in USB mode. Please help, it's a bit urgent!

    Read the article

  • Require file for mount and also update the file after mount?

    - by Andy Shinn
    I am trying to make sure a directory exists for a mount and then also update the permissions of that directory after the mount happens. I am receiving the following error:

        err: Failed to apply catalog: Cannot alias File[pre_eos_mount] to ["/var/tmp/eos"] at /etc/puppet/modules/mymodule/manifests/eos.pp:29; resource ["File", "/var/tmp/eos"] already declared at /etc/puppet/modules/mymodule/manifests/eos.pp:47

    I would like to do something like this:

        file { $eos_mount :
          ensure => 'directory',
          mode   => '1777',
          owner  => 'root',
          group  => 'root',
        }

        mount { $eos_mount :
          ensure  => 'mounted',
          device  => $lv_device,
          fstype  => $fstype,
          options => 'defaults,noatime,nodiratime',
          require => File[$eos_mount],
          notify  => File['post_eos'],
        }

        file { 'post_eos' :
          path   => $eos_mount,
          ensure => 'directory',
          mode   => '1777',
          owner  => 'root',
          group  => 'root',
        }

    What is a way to ensure permissions of a folder after it has been mounted?
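
    A hedged sketch of one way out of the duplicate-resource error (Puppet refuses two file resources managing the same path): declare the directory once, and let the mount notify an exec that re-asserts the mode only when a (re)mount actually happens. The resource title 'fix_eos_perms' is invented for illustration; the variables are the ones from the question:

        file { $eos_mount :
          ensure => 'directory',
          owner  => 'root',
          group  => 'root',
        }

        mount { $eos_mount :
          ensure  => 'mounted',
          device  => $lv_device,
          fstype  => $fstype,
          options => 'defaults,noatime,nodiratime',
          require => File[$eos_mount],
          notify  => Exec['fix_eos_perms'],
        }

        exec { 'fix_eos_perms' :
          command     => "/bin/chmod 1777 ${eos_mount}",
          refreshonly => true,  # runs only when the mount sends a refresh event
        }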

    Read the article

  • Building a Student Storage server

    - by DobotJr
    I work for a school district, and I've been put in charge of building a storage server for students: a place for them to work from at school and at home. My challenge is getting this to work from home. At school they log in, authenticate, and get a mapped drive (S:) to their folder on the server (\\fileserver\studentname). My question is how I can make this available to students at home. The server is running Windows Server 2003 R1. I've got PHP, Apache, and MySQL working together. My idea is to write a script that will "crawl" through the directory containing all of the student folders, then record every file and folder in a MySQL DB, and create a login page that uses LDAP for authentication; once students log in to the server from home, they get a page with the folders and files tied to their username. Has anyone out there ever put something like this together?
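
    A hedged sketch of the authentication piece, since PHP's LDAP extension covers it. The host name, domain prefix, and share path below are placeholders, not known values, and empty passwords must be rejected explicitly because LDAP treats an empty-password bind as an anonymous bind that "succeeds":

        <?php
        $user = $_POST['username'];
        $pass = $_POST['password'];
        if ($pass === '') { die('Login failed'); }  // empty bind = anonymous bind

        $ldap = ldap_connect('ldap://dc.mydistrict.local');  // placeholder host
        ldap_set_option($ldap, LDAP_OPT_PROTOCOL_VERSION, 3);

        if (@ldap_bind($ldap, 'MYDOMAIN\\' . $user, $pass)) {
            // authenticated: list only this student's folder
            $home = '\\\\fileserver\\students\\' . basename($user);  // placeholder path
            foreach (scandir($home) as $entry) {
                echo htmlspecialchars($entry), '<br>';
            }
        } else {
            echo 'Login failed';
        }

    With something like this in place, the MySQL crawl becomes an optimization (a cached index of the folder tree) rather than a requirement.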

    Read the article

  • Access to certain files but not others

    - by ADW
    Hoping someone can help me, as I have thus far been unable to solve the issue. I am running a media center on Ubuntu 12.04. I was initially successful accessing media files on the Ubuntu desktop from my Windows 7 laptop and Roku device. I then started backing up a new batch of DVDs I had (into MKV files, like everything else in my media folders) and noticed I cannot access the new files from either the Roku or the laptop. I have not changed any settings in the media folder, and I verified the share permissions. The parent folder (Media) is shared (with permission flow-down) while the subfolders (Movies, TV Shows, Music) are not. I changed the permissions on these to shared when the access problem arose, but with no success. I can only access the original files, not new files added. Any suggestions? Thanks in advance for any and all help.
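
    A hedged first check for this symptom (the paths below are placeholders): new files written by a different tool or user often land with tighter Unix permissions than the old ones, and the share simply inherits that. Comparing a working file with a failing one usually settles it:

        ls -l /home/user/Media/Movies/OldWorkingFile.mkv /home/user/Media/Movies/NewFile.mkv
        # if the new files are missing group/other read bits:
        chmod 644 /home/user/Media/Movies/NewFile.mkv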

    Read the article

  • Moses v1.0 multi-language ini file

    - by Milan Kocic
    I was working with mosesserver 0.91 and everything worked fine, but now there is version 1.0 and nothing is the same as before. Here is my situation: I want multi-language translation, from Arabic to English and from English to Arabic. All the data and the configuration file I have work with the 0.91 version of mosesserver. Here is my config file:

        #########################
        ### MOSES CONFIG FILE ###
        #########################

        # D - decoding path, R - reordering model, L - language model
        [translation-systems]
        ar-en D 0 R 0 L 0
        en-ar D 1 R 1 L 1

        # input factors
        [input-factors]
        0

        # mapping steps
        [mapping]
        0 T 0
        1 T 1

        # translation tables: table type (hierarchical(0), textual (0), binary (1)),
        # source-factors, target-factors, number of scores, file
        # OLD FORMAT is still handled for back-compatibility
        # OLD FORMAT translation tables: source-factors, target-factors, number of scores, file
        # OLD FORMAT a binary table type (1) is assumed
        [ttable-file]
        1 0 0 5 /mnt/models/ar-en/phrase-table/phrase-table
        1 0 0 5 /mnt/models/en-ar/phrase-table/phrase-table

        # no generation models, no generation-file section

        # language models: type(srilm/irstlm), factors, order, file
        [lmodel-file]
        1 0 5 /mnt/models/ar-en/language-model/en.qblm.mm
        1 0 5 /mnt/models/en-ar/language-model/ar.lm.d1.blm.mm

        # limit on how many phrase translations e for each phrase f are loaded
        # 0 = all elements loaded
        [ttable-limit]
        20

        # distortion (reordering) files
        [distortion-file]
        0-0 wbe-msd-bidirectional-fe-allff 6 /mnt/models/ar-en/reordering-table/reordering-table.wbe-msd-bidirectional-fe.gz
        0-0 wbe-msd-bidirectional-fe-allff 6 /mnt/models/en-ar/reordering-model/reordering-table.wbe-msd-bidirectional-fe.gz

        # distortion (reordering) weight
        [weight-d]
        0.3
        0.3

        # lexicalised distortion weights
        [weight-lr]
        0.3 0.3 0.3 0.3 0.3 0.3
        0.3 0.3 0.3 0.3 0.3 0.3

        # language model weights
        [weight-l]
        0.5000
        0.5000

        # translation model weights
        [weight-t]
        0.2 0.2 0.2 0.2 0.2
        0.2 0.2 0.2 0.2 0.2

        # no generation models, no weight-generation section

        # word penalty
        [weight-w]
        -1
        -1

        [distortion-limit]
        12

    So please, can someone help me rewrite this config file so it works in version 1.0? I also need some sample Python code for translation. I am using XML-RPC in Python, and earlier I sent the HTTP request with:

        import xmlrpclib
        client = xmlrpclib.ServerProxy('http://localhost:8080')
        client.translate({'text': 'some text', 'system': 'en-ar'})

    but now it seems there is no more 'system' parameter and Moses always uses the default settings.

    Read the article

  • Doxygen: grouping documentation by folder in a multi-project codebase

    - by John
    In one project, some pages were added; it was a new project in which doxygen was being tested properly (adding comments and pages) rather than simply auto-generating docs from our existing code-base. The problem is that when doxygen is run on the main code-base, that project's pages show up at the top level, e.g. they have a main page and some sub-pages, but what we'd want is all those pages pushed down one level, under a main-project page. One question is what happens if multiple projects have a main page: do they get combined, or throw errors? Another question is whether you can tell doxygen to use paths and containing folders to auto-generate groups or sections or page hierarchies in some way. Going through all our projects to properly assign classes to groups is a mammoth task, so ideally everything in a directory would get put in a group of that name, as a way to make the documentation of non-doxygen code-bases better. Sorry my question is a bit vague; the problem is that even after reading the docs the terminology isn't totally clear yet. Hopefully the kind of question I'm asking is clear; if not, I'll try to cobble an example file structure together.
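
    A hedged sketch of the relevant doxygen mechanisms (the page label and group name below are invented for illustration): \subpage makes an existing page a child of the page that references it, which is the usual way to push a project's pages down one level, and a per-directory header full of \addtogroup blocks is the usual workaround for grouping, since doxygen does not derive groups from folder paths on its own. This assumes the sub-project declares its top page with \page rather than \mainpage, since only one \mainpage makes sense per run:

        /** \mainpage Code-base documentation
         *  - \subpage projectA_main "Project A" (its former main page)
         */

        /** \defgroup projectA Project A
         *  Everything under the ProjectA/ folder.
         */

        /** \addtogroup projectA
         *  @{
         */
        class Widget { };
        /** @} */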

    Read the article

  • Loading velocity template inside a jar file

    - by Rafael
    I have a project where I want to load a Velocity template and fill it in with parameters. The whole application is packaged as a jar file. What I initially thought of doing was this:

        VelocityEngine ve = new VelocityEngine();
        URL url = this.getClass().getResource("/templates/");
        File file = new File(url.getFile());
        ve = new VelocityEngine();
        ve.setProperty(RuntimeConstants.RESOURCE_LOADER, "file");
        ve.setProperty(RuntimeConstants.FILE_RESOURCE_LOADER_PATH, file.getAbsolutePath());
        ve.setProperty(RuntimeConstants.FILE_RESOURCE_LOADER_CACHE, "true");
        ve.init();

        VelocityContext context = new VelocityContext();
        if (properties != null) {
            stringfyNulls(properties);
            for (Map.Entry<String, Object> property : properties.entrySet()) {
                context.put(property.getKey(), property.getValue());
            }
        }

        final String templatePath = templateName + ".vm";
        Template template = ve.getTemplate(templatePath, "UTF-8");

        String outFileName = File.createTempFile("p2d_report", ".html").getAbsolutePath();
        BufferedWriter writer = new BufferedWriter(new FileWriter(new File(outFileName)));
        template.merge(context, writer);
        writer.flush();
        writer.close();

    And this works fine when I run it in Eclipse. However, once I package the program and try to run it from the command line, I get an error because the file could not be found. I imagine the problem is in this line:

        ve.setProperty(RuntimeConstants.FILE_RESOURCE_LOADER_PATH, file.getAbsolutePath());

    because inside a jar an absolute file path does not exist, since everything is inside a zip, but I couldn't yet find a better way to do it. Anyone have any ideas?
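
    A hedged pointer, since this is the classic symptom of turning a classpath resource into a java.io.File: inside a jar there is no file-system path, so getResource(...).getFile() yields nothing the FileResourceLoader can use. Velocity ships a ClasspathResourceLoader that reads templates through the class loader instead; a minimal sketch, assuming the templates sit at templates/<name>.vm inside the jar:

        import org.apache.velocity.Template;
        import org.apache.velocity.app.VelocityEngine;
        import org.apache.velocity.runtime.RuntimeConstants;
        import org.apache.velocity.runtime.resource.loader.ClasspathResourceLoader;

        VelocityEngine ve = new VelocityEngine();
        ve.setProperty(RuntimeConstants.RESOURCE_LOADER, "classpath");
        ve.setProperty("classpath.resource.loader.class",
                       ClasspathResourceLoader.class.getName());
        ve.init();
        // the path is resolved against the classpath root, not the file system
        Template template = ve.getTemplate("templates/" + templateName + ".vm", "UTF-8");

    This works identically in Eclipse and from the packaged jar, because both load the template through the class loader.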

    Read the article

  • Why does first call to java.io.File.createTempFile(String,String,File) take 5 seconds on Citrix?

    - by Ben Roling
    While debugging slow startup of an Eclipse RCP app on a Citrix server, I found that java.io.File.createTempFile(String, String, File) is taking 5 seconds. It does this only on the first execution and only for certain user accounts; specifically, I am noticing it with Citrix anonymous user accounts. I have not tried many other types of accounts, but this behavior is not exhibited with an administrator account. Also, it does not matter whether the user has access to write to the given directory or not: if the user does not have access, the call will take 5 seconds to fail; if they do have access, the call will take 5 seconds to succeed. This is on a Windows 2003 Server. I've tried Sun's 1.6.0_16 and 1.6.0_19 JREs and see the same behavior. I googled a bit expecting this to be some sort of known issue, but didn't find anything. It seems like someone else would have had to run into this before. The Eclipse Platform uses File.createTempFile() to test various directories to see if they are writeable during initialization, and this issue adds 5 seconds to the startup time of our application. I imagine somebody has run into this before and might have some insight. Here is sample code I executed to see that it is indeed this call that is consuming the time. I also tried it with a second call to createTempFile and noticed that subsequent calls return nearly instantaneously.

        public static void main(final String[] args) throws IOException {
            final File directory = new File(args[0]);
            final long startTime = System.currentTimeMillis();
            File file = null;
            try {
                file = File.createTempFile("prefix", "suffix", directory);
                System.out.println(file.getAbsolutePath());
            } finally {
                System.out.println(System.currentTimeMillis() - startTime);
                if (file != null) {
                    file.delete();
                }
            }
        }

    Sample output of this program is the following:

        C:\>java.exe -jar filetest.jar C:/Temp
        C:\Temp\prefix8098550723198856667suffix
        5093

    Read the article

  • Rails upload file to ftp server

    - by Bob
    I'm on Rails 2.3.5 and Ruby 1.8.6, trying to figure out how to let a user upload a file to an FTP server on a different machine than my Rails app. Also, my Rails app will be hosted on Heroku, which doesn't facilitate writing files to the local filesystem.

    index.html.erb:

        <% form_tag '/ftp/upload', :method => :post, :multipart => true do %>
          <label for="file">File to Upload</label>
          <%= file_field_tag "file" %>
          <%= submit_tag 'Upload' %>
        <% end %>

    ftp_controller.rb:

        require 'net/ftp'

        class FtpController < ApplicationController
          def upload
            file = params[:file]
            ftp = Net::FTP.new('remote-ftp-server')
            ftp.login(user = "***", passwd = "***")
            ftp.puttextfile(file.read, File.basename(file.original_filename))
            ftp.quit()
          end

          def index
          end
        end

    Currently I'm just trying to get the Rails app to work on my Windows laptop. With the above code, I'm getting this error:

        Errno::ENOENT in FtpController#upload
        No such file or directory -....

    followed by a dump of the file contents. Anyone know what's going on?
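
    A hedged observation on the failing call: Net::FTP#puttextfile(localfile, remotefile) expects a local file *path* as its first argument, so passing file.read hands it the file's contents, which it then tries to open as a path; that matches both the Errno::ENOENT and the dump of the contents in the error message. A sketch of the corrected call, assuming the upload is large enough that Rails backs it with a Tempfile (very small uploads arrive as StringIO and expose no path):

        # file.path is the server-side tempfile Rails wrote the upload to
        ftp.puttextfile(file.path, File.basename(file.original_filename))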

    Read the article

  • AIR File.resolvePath won't work anymore

    - by Palleas
    Hi all, I'm having a very strange issue: it looks like my application can't create files anymore. It works with directories, but the so-many-times-used resolvePath() method doesn't. Here is what I do:

        var databaseFileContent : File = new File(File.desktopDirectory.nativePath + "/testing");
        databaseFileContent.createDirectory();
        databaseFileContent.resolvePath("test");

    (Here I'm trying on the desktop, but it's the same with applicationStorageDirectory.) When I execute this, it works only for the "testing" folder, which is actually created, but my file isn't. I tried to create another application doing this:

        trace(File.desktopDirectory.resolvePath("maiswtf.db").exists);
        trace(File.applicationStorageDirectory.resolvePath("wtf.db").exists);

    Both display "false". Am I missing something here? I have another application with this code:

        var databaseFileContent : File = File.applicationStorageDirectory.resolvePath(File.separator + "sitra.db");

    When I run this one, it works perfectly! My file is created at /sitra.db! Any hints? I think I'm going mad :/ Thanks!
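
    A hedged reading of the API for others landing here: File.resolvePath() only builds an in-memory File reference; it never creates anything on disk, which is why exists stays false until something actually writes to the path. In the application that "works", presumably whatever later consumed that File reference (a database open, for instance) created the file. A minimal sketch of forcing a file into existence with a FileStream (flash.filesystem classes, reusing the "testing" folder created above):

        var target:File = File.desktopDirectory.resolvePath("testing/test.db");
        var stream:FileStream = new FileStream();
        stream.open(target, FileMode.WRITE);  // creates the file (truncates it if present)
        stream.close();
        trace(target.exists);  // true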

    Read the article

  • WP + SQL: image not displaying from the folder

    - by happy
    This is my code for uploading an image into the database. The images are going to the desired folder, but when I try to retrieve the images to display, they are not displayed. Can anyone help me?

        $category = $_POST['category'];
        $uploadDir = 'D:/xampp/htdocs/js/wordpress/wp-content/plugins/img/imagess/ ';
        $fileName = $_FILES['Photo']['name'];
        $tmpName = $_FILES['Photo']['tmp_name'];
        $fileSize = $_FILES['Photo']['size'];
        $fileType = $_FILES['Photo']['type'];
        $filePath = $uploadDir . $fileName;
        $result = move_uploaded_file($tmpName, $filePath);
        if (!$result) {
            echo "Error uploading file";
            exit;
        }
        if (!get_magic_quotes_gpc()) {
            $fileName = addslashes($fileName);
            $filePath = addslashes($filePath);
        }
        global $wpdb;
        //$insert = $wpdb->insert('images', array('image_name' => $filePath, 'cat_name' => $category), array('%b', '%s'));
        $insert = $wpdb->insert('images', array('image_name' => $filePath, 'cat_name' => $category));
        $wpdb->insert('categories', array('cat_name' => $category));
        echo "Successfully Submitted";
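
    A hedged guess at the display failure, based only on the code above: image_name stores a server filesystem path (D:/xampp/htdocs/...), but an <img> tag needs a URL under the site root; note also the trailing space in $uploadDir, which ends up in every stored path. A sketch of building the URL from the stored file name instead (plugins_url() and esc_url() are WordPress's own helpers; the 'img/imagess' segment is taken from the path above):

        // store only $fileName in the table, then when displaying:
        $url = plugins_url('img/imagess/' . rawurlencode($fileName));
        echo '<img src="' . esc_url($url) . '" alt="" />';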

    Read the article

  • How to create a Global Rule that stores a document’s folder path in a custom metadata field

    - by Nicolas Montoya
    Efficiency purists would argue that redundancy is not necessary. In real life, we are willing to pay a price for performance, i.e. to have information at our fingertips. We have run into customers opting to store a document's folder path as a document metadata field. They have their reasons, half of the ECM community will agree with them, and the other half would raise an eyebrow. In the end, they are getting creative to achieve their document management goals. The steps below outline how to create a Global Rule that stores a document's folder path in a custom metadata field.

    Create a Global Rule via Configuration Manager > Rules Tab > Add. Check "Is global rule with priority", then check "Use rule activation condition". Go to "Edit", check the actions for the Script Properties, and click OK; the rule activation condition will appear. Then go to the Fields Tab and add a Rule Field. Select the target Custom Metadata Field and click OK, check "Is derived field", then "Edit", go to the Custom Tab in the Script Properties window, and enter the custom script below:

        <$if #active.dCollectionPath$>
          <$dprDerivedValue=#active.dCollectionPath$>
        <$else$>
          <$dprDerivedValue=#active.xCollectionIDPath$>
        <$endif$>

    For more information on the dCollectionPath property, see Section 8.2 Folder Services in the Oracle Fusion Middleware Services Reference Guide for Oracle Universal Content Management 11g Release 1 (11.1.1): http://docs.oracle.com/cd/E21043_01/doc.1111/e11011/c08_folders002.htm

    The above rule will keep the Custom Metadata Field updated with the folder path information when a document is checked in via the Content Server (CS) web interface or the Desktop Integration Suite (DIS).

    Read the article

  • Using Recursive SQL and XML trick to PIVOT(OK, concat) a "Document Folder Structure Relationship" table, works like MySQL GROUP_CONCAT

    - by Kevin Shyr
    I'm in the process of building out a Data Warehouse and encountered this issue along the way. In the environment, there is a table that stores all the folders, one row per level. For example, if a document is created at {App Path}\Level 1\Level 2\Level 3\{document}, then the DocumentFolder table would look like this:

        ID    ID_Parent    FolderName
        1     NULL         Level 1
        2     1            Level 2
        3     2            Level 3

    To my understanding, the table was built so that each proposal can have multiple documents stored at various locations, and different users working on the proposal can have different access levels: if a user is assigned access to a folder level, she/he can see all the sub-folders and their content. Now, we understand from an application point of view why this table was built this way. But you can quickly see the pain this causes the report writer who has to show a document link on the report. I wasn't surprised to find that the report query had 5 self outer joins, which is at the mercy of nobody creating a document buried 6 levels deep, not to mention the degradation in performance. With the help of 2 posts (linked at the end of this post), I was able to come up with this solution: use recursive SQL to build out the folder path, and use the SQL XML trick to concatenate the strings. Code (a reminder: I built this code in a stored procedure; if you copy the syntax into a simple query window and execute it, you'll get an incorrect-syntax error):

        -- Get all folders and group them by the original DocumentFolderID in the PTSDocument table
        ;WITH DocFoldersByDocFolderID(PTSDocumentFolderID_Original, PTSDocumentFolderID_Parent, sDocumentFolder, nLevel)
        AS (
            -- first member
            SELECT 'PTSDocumentFolderID_Original' = d1.PTSDocumentFolderID
                 , PTSDocumentFolderID_Parent
                 , 'sDocumentFolder' = sName
                 , 'nLevel' = CONVERT(INT, 1000000)
            FROM (SELECT DISTINCT PTSDocumentFolderID
                  FROM dbo.PTSDocument_DY WITH(READPAST)
                 ) AS d1
                 INNER JOIN dbo.PTSDocumentFolder_DY AS df1 WITH(READPAST)
                       ON d1.PTSDocumentFolderID = df1.PTSDocumentFolderID
            UNION ALL
            -- recursive
            SELECT ddf1.PTSDocumentFolderID_Original
                 , df1.PTSDocumentFolderID_Parent
                 , 'sDocumentFolder' = df1.sName
                 , 'nLevel' = ddf1.nLevel - 1
            FROM dbo.PTSDocumentFolder_DY AS df1 WITH(READPAST)
                 INNER JOIN DocFoldersByDocFolderID AS ddf1
                       ON df1.PTSDocumentFolderID = ddf1.PTSDocumentFolderID_Parent
        )
        -- Flatten out the folder path
        , DocFolderSingleByDocFolderID(PTSDocumentFolderID_Original, sDocumentFolder)
        AS (
            SELECT dfbdf.PTSDocumentFolderID_Original
                 , 'sDocumentFolder' = STUFF((SELECT '\' + sDocumentFolder
                                              FROM DocFoldersByDocFolderID
                                              WHERE (PTSDocumentFolderID_Original = dfbdf.PTSDocumentFolderID_Original)
                                              ORDER BY PTSDocumentFolderID_Original, nLevel
                                              FOR XML PATH ('')), 1, 1, '')
            FROM DocFoldersByDocFolderID AS dfbdf
            GROUP BY dfbdf.PTSDocumentFolderID_Original
        )

    And voila: I use the second CTE to join back to my original query (which is now a CTE for the source, as we can now use MERGE to do INSERT and UPDATE at the same time). Each part of this solution would not solve the problem by itself, because: if I don't use recursion, I cannot build out the path properly; if I use the XML trick only, then I don't have the originating folder ID info that I need to link to the document; and if I don't use the XML trick, then I don't have one row per document to show in the report. I could conceivably do this in a report function, but I'd rather not deal with the leading or trailing backslash and how to attach the document name. PIVOT doesn't do strings, and UNPIVOT runs into the same problem as the above. I'm excited that each version of SQL Server provides new tools to solve old problems and/or enables us to solve problems in a more elegant way.

    The 2 posts that helped me along:
    Recursive Queries Using Common Table Expressions
    How to use GROUP BY to concatenate strings in SQL Server?

    Read the article

  • Difference between putting variables in header vs putting variables in source

    - by Mohit Deshpande
    Say I declare a variable in a header file: int count; Then, in the source file, do I have to declare it as extern int count, or can I just use it in my source file (assuming I have #include "someheader.h")? Or should I just declare it in the source file? What is the difference between putting count in the header file and putting it in the source file? Or does it not matter?
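
    A hedged sketch of the conventional split, for anyone stuck on the same point (the file names are invented for illustration): writing int count; in a header gives every source file that includes it its own definition of count, which collides at link time (always in C++, and in C depending on how the linker treats tentative definitions), so the header should carry only a declaration:

        /* someheader.h -- a declaration: promises count exists somewhere */
        extern int count;

        /* someheader.c -- the single definition: this allocates the storage */
        #include "someheader.h"
        int count = 0;

        /* main.c -- any other file includes the header and just uses it */
        #include <stdio.h>
        #include "someheader.h"

        int main(void)
        {
            printf("%d\n", count);
            return 0;
        }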

    Read the article

  • Need help manipulating WAV (RIFF) Files at a byte level

    - by Eric
    I'm writing an application in C# that will record audio files (*.wav) and automatically tag and name them. Wave files are RIFF files (like AVI) which can contain metadata chunks in addition to the waveform data chunks. So now I'm trying to figure out how to read and write the RIFF metadata to and from recorded wave files. I'm using NAudio for recording the files, and asked on their forums as well as on SO for a way to read and write RIFF tags. While I received a number of good answers, none of the solutions allowed for reading and writing RIFF chunks as easily as I would like. But more importantly, I have very little experience dealing with files at a byte level, and think this could be a good opportunity to learn. So now I want to try writing my own class(es) that can read in a RIFF file and allow metadata to be read from, and written to, the file. I've used streams in C#, but always with the entire stream at once. So now I'm a little lost, now that I have to consider a file byte by byte. Specifically, how would I go about removing or inserting bytes in the middle of a file? I've tried reading a file through a FileStream into a byte array (byte[]), as shown in the code below:

        System.IO.FileStream waveFileStream = System.IO.File.OpenRead(@"C:\sound.wav");
        byte[] waveBytes = new byte[waveFileStream.Length];
        waveFileStream.Read(waveBytes, 0, waveBytes.Length);

    And I could see through the Visual Studio debugger that the first four bytes are the RIFF header of the file. But arrays are a pain to deal with when performing actions that change their size, like inserting or removing values. So I was thinking I could then turn the byte[] into a List like this:

        List<byte> list = waveBytes.ToList<byte>();

    which would make any manipulation of the file byte by byte a whole lot easier, but I'm worried I might be missing something, like a class in the System.IO namespace, that would make all this even easier. Am I on the right track, or is there a better way to do this? I should also mention that I'm not hugely concerned with performance, and would prefer not to deal with pointers or unsafe code blocks like this guy. If it helps at all, here is a good article on the RIFF/WAV file format.
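
    For anyone starting down the same road, a minimal sketch of walking the RIFF chunk list with BinaryReader (just the read side; the path is the one from the question). RIFF stores everything little-endian, which is exactly what BinaryReader reads, and every chunk is a 4-byte ID, a 4-byte size, then that many data bytes padded to an even boundary:

        using System;
        using System.IO;
        using System.Text;

        class RiffDump
        {
            static void Main()
            {
                using (var r = new BinaryReader(File.OpenRead(@"C:\sound.wav")))
                {
                    string riff = Encoding.ASCII.GetString(r.ReadBytes(4)); // "RIFF"
                    int fileSize = r.ReadInt32();                           // bytes after this field
                    string wave = Encoding.ASCII.GetString(r.ReadBytes(4)); // "WAVE"
                    Console.WriteLine("{0} ({1} bytes) {2}", riff, fileSize, wave);

                    while (r.BaseStream.Position < r.BaseStream.Length)
                    {
                        string id = Encoding.ASCII.GetString(r.ReadBytes(4)); // e.g. "fmt ", "data", "LIST"
                        int size = r.ReadInt32();
                        Console.WriteLine("chunk {0}: {1} bytes", id, size);
                        r.BaseStream.Seek(size + (size & 1), SeekOrigin.Current); // skip data + pad byte
                    }
                }
            }
        }

    Because chunk sizes are explicit, inserting or removing a metadata chunk is usually done by writing a new file (header, then the chunks before the edit point, then the new chunk, then the rest) and patching the top-level size field, rather than shifting bytes around inside the original file.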

    Read the article
