Search Results

Search found 78189 results on 3128 pages for 'file management'.


  • From the Tips Box: Pin Any File to the Windows 7 Taskbar

    - by Jason Fitzpatrick
    Every week we dip into the tip box and share the tips you send in. This week we’re highlighting a great tip, and the accompanying tutorial video, that shows you how to pin any file to the Windows 7 taskbar. Robert Jasinski writes in with a clever way to pin any file you want to the taskbar. By default, if you drag a text document to the taskbar it will pin it to the Notepad executable; the same thing happens with any other file that has an association with an executable. What if you want to pin that specific text file to the taskbar and not to the executable (or any other file for that matter)? Robert shares his method:

    Read the article

  • Mac Management and Security

    - by Bart Silverstrim
    I was going through some literature on managing OS X laptops and asked someone more knowledgeable than I am about a usage scenario for the MacBooks: if I were visiting another site for a conference, or on the wifi network at a local coffee house, could my Mac be taken over by policies from an OS X Server running Workgroup Manager (either the site's legitimate server, or a copy of OS X Server someone is running on hardware hidden somewhere on the network)? Apparently such a server can be set up to do things like limit my access to Finder or impose other neat whiz-bang management features. He said that it is indeed possible: the server would be assigned via DHCP, the OS X Server would assume my Mac is a guest and could hand out restrictions, and apparently my Mac will happily accept them without notifying me or giving me an option, unlike Windows, which I believe would need to be joined to a domain before it becomes "managed" by Active Directory. So my question is, as network admins and sysadmins with users traveling with MacBooks: is there a way to reasonably protect your users from having their machines hijacked without resorting to just turning off networking all the time? Or isn't this much of a security hazard? What threat does this pose to the road warriors in your businesses?

    Read the article

  • BitLocker with Windows DPAPI Encryption Key Management

    - by bigmac
    We have a need to enforce at-rest encryption on an iSCSI LUN that is accessible from within a Hyper-V virtual machine. We have implemented a working solution using BitLocker, using Windows Server 2012 on a Hyper-V virtual server which has iSCSI access to a LUN on our SAN. We were able to do this successfully by using the "floppy disk key storage" hack as defined in THIS POST. However, this method seems "hokey" to me. In my continued research, I found out that the Amazon corporate IT team published a WHITEPAPER that outlined exactly what I was looking for in a more elegant solution, without the "floppy disk hack". On page 7 of this white paper, they state that they implemented Windows DPAPI encryption key management to securely manage their BitLocker keys. This is exactly what I am looking to do, but they state that they had to write a script to do this, yet they don't provide the script or even any pointers on how to create one. Does anyone have details on how to create a "script in conjunction with a service and a key-store file protected by the server’s machine account DPAPI key" (as they state in the whitepaper) to manage and auto-unlock BitLocker volumes? Any advice is appreciated.
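
    One way to sketch what that whitepaper describes (an assumption about the approach, not Amazon's actual script): protect the volume's 48-digit recovery password with DPAPI in LocalMachine scope, store the resulting blob in a key-store file, and have a scheduled task or service decrypt it at startup and feed it to manage-bde. The drive letter, file path and password below are placeholders.

        # one-time step: protect the recovery password on this server
        Add-Type -AssemblyName System.Security
        $recoveryPassword = '123456-123456-123456-123456-123456-123456-123456-123456'
        $blob = [System.Security.Cryptography.ProtectedData]::Protect(
            [System.Text.Encoding]::UTF8.GetBytes($recoveryPassword), $null,
            [System.Security.Cryptography.DataProtectionScope]::LocalMachine)
        [System.IO.File]::WriteAllBytes('C:\BitLockerKeys\iscsi-lun.bin', $blob)

        # at startup (e.g. a scheduled task running as SYSTEM): unprotect and unlock
        $blob = [System.IO.File]::ReadAllBytes('C:\BitLockerKeys\iscsi-lun.bin')
        $pw = [System.Text.Encoding]::UTF8.GetString(
            [System.Security.Cryptography.ProtectedData]::Unprotect($blob, $null,
                [System.Security.Cryptography.DataProtectionScope]::LocalMachine))
        manage-bde -unlock E: -RecoveryPassword $pw

    Note that LocalMachine DPAPI scope means any process on that server can decrypt the blob, so NTFS permissions on the key-store file still matter.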

    Read the article

  • Disabling CPU management

    - by Tiffany Walker
    If I add processor.max_cstate=0 to the kernel command line for boot-up, does that disable all CPU power management and throttling? I also found: http://www.experts-exchange.com/OS/Linux/Administration/A_3492-Avoiding-CPU-speed-scaling-in-modern-Linux-distributions-Running-CPU-at-full-speed-Tips.html The link talks about changing the CPU governor from 'ondemand' to 'performance' for all CPUs/cores and disabling kondemand in the kernel. The server is used for web hosting.

    UPDATES:

    2.6.32-379.1.1.lve1.1.7.6.el6.x86_64 #1 SMP Sat Aug 4 09:56:37 EDT 2012 x86_64 x86_64 x86_64 GNU/Linux

    # dmidecode 2.11
    SMBIOS 2.6 present.
    74 structures occupying 2878 bytes.
    Table at 0x0009F000.

    Handle 0x0000, DMI type 0, 24 bytes
    BIOS Information
        Vendor: American Megatrends Inc.
        Version: 1.0c
        Release Date: 05/27/2010
        Address: 0xF0000
        Runtime Size: 64 kB
        ROM Size: 4096 kB
        Characteristics:
            ISA is supported
            PCI is supported
            PNP is supported
            BIOS is upgradeable
            BIOS shadowing is allowed
            ESCD support is available
            Boot from CD is supported
            Selectable boot is supported
            BIOS ROM is socketed
            EDD is supported
            5.25"/1.2 MB floppy services are supported (int 13h)
            3.5"/720 kB floppy services are supported (int 13h)
            3.5"/2.88 MB floppy services are supported (int 13h)
            Print screen service is supported (int 5h)
            8042 keyboard services are supported (int 9h)
            Serial services are supported (int 14h)
            Printer services are supported (int 17h)
            CGA/mono video services are supported (int 10h)
            ACPI is supported
            USB legacy is supported
            LS-120 boot is supported
            ATAPI Zip drive boot is supported
            BIOS boot specification is supported
            Targeted content distribution is supported
        BIOS Revision: 8.16

    Handle 0x0001, DMI type 1, 27 bytes
    System Information
        Manufacturer: Supermicro
        Product Name: X8SIE
        Version: 0123456789
        Serial Number: 0123456789
        UUID: 49434D53-0200-9033-2500-33902500D52C
        Wake-up Type: Power Switch
        SKU Number: To Be Filled By O.E.M.
        Family: To Be Filled By O.E.M.

    Handle 0x0002, DMI type 2, 15 bytes
    Base Board Information
        Manufacturer: Supermicro
        Product Name: X8SIE
        Version: 0123456789
        Serial Number: VM11S61561
        Asset Tag: To Be Filled By O.E.M.
        Features:
            Board is a hosting board
            Board is replaceable
        Location In Chassis: To Be Filled By O.E.M.
        Chassis Handle: 0x0003
        Type: Motherboard
        Contained Object Handles: 0

    Handle 0x0003, DMI type 3, 21 bytes
    Chassis Information
        Manufacturer: Supermicro
        Type: Sealed-case PC
        Lock: Not Present
        Version: 0123456789
        Serial Number: 0123456789
        Asset Tag: To Be Filled By O.E.M.
        Boot-up State: Safe
        Power Supply State: Safe
        Thermal State: Safe
        Security Status: None
        OEM Information: 0x00000000
        Height: Unspecified
        Number Of Power Cords: 1
        Contained Elements: 0
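
    As far as I can tell, processor.max_cstate=0 only limits idle C-states; frequency scaling and throttling are handled separately by the cpufreq governor, which is what the linked article changes. A rough sketch of that change on an EL6-style kernel like the one above (the cpuspeed service name is an assumption based on the kernel string):

        # switch every core to the 'performance' governor (immediate, not persistent)
        for g in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
            echo performance > "$g"
        done

        # keep it that way across reboots by stopping the service that re-applies 'ondemand'
        chkconfig cpuspeed off
        service cpuspeed stop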

    Read the article

  • SLK opens SCORM package as a ZIP file

    - by Cherie Riesberg
    Symptom: after installing SharePoint Learning Kit successfully (http://www.codeplex.com/SLK), everything works except that the SCORM package (a ZIP extension) opens as a ZIP file instead of a course; you get the normal ZIP prompt "Do you want to open or save this file?" Problem: the package is zipped at the upper folder level, so the manifest is not where SharePoint expects it and the file is treated as a plain ZIP rather than a SCORM package. Solution: add the contents of the course to the ZIP, not the outer (uppermost) folder. This creates a ZIP file that SharePoint can recognize as a SCORM package.
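
    A small command-line illustration of the difference (folder and file names are made up); a SCORM package needs imsmanifest.xml at the top level of the archive:

        # wrong: zips the folder itself, so imsmanifest.xml ends up one level down
        zip -r course.zip mycourse/

        # right: zip the folder's contents, so imsmanifest.xml sits at the root of the ZIP
        cd mycourse/
        zip -r ../course.zip .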

    Read the article

  • Server 2008 Disk Management Hangs

    - by Payson Welch
    So I have looked everywhere for the solution to this and have tried many things. There is one post on SE related to this and I tried the suggested answer, but I am still having problems. We have a server running Server 2008 R2 Standard x64. I need to increase the space of C: since the free space is running very low. However, when I open Server Manager and try to go to the "Disk Management" snap-in it just hangs. There is a status message on the bottom of the window that says "Connecting to Virtual Disk Service...". Here are the steps I have taken:
    - Ran sfc /scannow
    - Set all of the drives to be dirty and rebooted so that they would be scanned
    - Executed chkdsk /f /r /b /v on all of the drives
    - Checked for Windows updates (none)
    - Verified that the services "Virtual Disk", "RPC Procedures" and "Plug and Play" are all running
    One symptom is that the service "Virtual Disk" does not cleanly shut down; I receive a message about the process being unexpectedly terminated when I try to stop or restart it. Also, I cannot find anything relevant in the event logs. Any ideas or suggestions?
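
    Since the Virtual Disk service will not stop cleanly, one thing worth trying from an elevated command prompt is forcibly bouncing it before reopening the snap-in (a sketch, not a guaranteed fix):

        rem show the Virtual Disk service state and its process ID
        sc queryex vds
        rem forcibly end the hung process (replace 1234 with the PID reported above)
        taskkill /pid 1234 /f
        rem start it again, then reopen Disk Management
        net start vds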

    Read the article

  • Only 192.168.0.3 can request, but anyone can request /public/file.html

    - by mattalexx
    I have the following virtual host on my development server:

    <VirtualHost *:80>
        ServerName example.com
        DocumentRoot /srv/web/example.com/pub
        <Directory /srv/web/example.com/pub>
            Order Deny,Allow
            Deny from all
            Allow from 192.168.0.3
        </Directory>
    </VirtualHost>

    The Allow from 192.168.0.3 part is to only allow requests from my workstation machine. I want to tweak this to allow anyone to request a certain URL: http://example.com/public/file.html How do I change this to allow /public/file.html requests to get through from anyone? Note: /public/file.html doesn't actually exist as a file on the server. I redirect all incoming requests through a single index file using mod_rewrite.
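
    One way that should work with the same Apache 2.2-style directives already in use (a sketch, untested here): a <Location> block matches the request URI rather than a file on disk, so it applies even though /public/file.html is generated by mod_rewrite, and Location sections are merged after Directory sections, so its Allow overrides the directory-wide Deny. Inside the same <VirtualHost>:

        # open up just this one URL; the Directory block keeps denying everything else
        <Location /public/file.html>
            Order Deny,Allow
            Allow from all
        </Location>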

    Read the article

  • OS X Mavericks Won't Connect To Ubuntu Server (Netatalk, Avahi)

    - by Andy Ibanez
    I'm really sorry for posting this. I know it may have been asked a thousand times. I have googled like crazy and I'm on the verge of desperation here. Basically, I followed this guide: http://motionsoundfx.com/2012/05/ubuntu-vnc-afp-macosx/ to create a small personal file server. When I installed it, I was able to connect to it just fine: I connected with my Ubuntu username and password and I was able to see the home directory. But later I had to restart the file server so I could prepare a couple of other hard drives to put in. When the server restarted, I tried to connect to it, but I got an error message on my Mac: "The version of the server you're trying to connect to is not supported. Please contact your system administrator to solve this problem." Again, I have googled like crazy for this, and everybody says it is a problem with OS X Lion and up (assuming it affects Mavericks too). I have tried all the fixes mentioned for Lion and Mountain Lion and I haven't had any luck. That's the reason I'm posting this here: I suspect the problem is with my Ubuntu server, since this happened after I restarted it. Before the restart, I just put in my credentials and saw my home directory, so something must have gotten messed up when the server was restarted. I have found some other solutions, including using "SHX2" in the conf file, but it hasn't worked for me. I ask for your help to solve this issue. Also, please understand I'm completely illiterate when it comes to Linux. This is a nice chance for me to learn the OS, so please give me detailed steps if you deem it necessary. Thank you! I'm using Ubuntu Server 13.10 (the latest one as of today).
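
    A guess based on the symptom, not a verified fix: the "SHX2" suggestion presumably refers to the DHX2 authentication module (uams_dhx2.so), and the "version of the server is not supported" message on Lion and later is usually tied to the authentication modules afpd advertises. Assuming the guide installed netatalk 2.x, the relevant line at the end of /etc/netatalk/afpd.conf would look roughly like this:

        # /etc/netatalk/afpd.conf -- flags per a typical default install (sketch)
        - -tcp -noddp -uamlist uams_dhx.so,uams_dhx2.so -nosavepassword

    followed by sudo service netatalk restart.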

    Read the article

  • Changing the sequencing strategy for File/Ftp Adapter

    - by [email protected]
    The File/Ftp Adapter allows the user to configure the outbound write to use a sequence number. For example, if I choose address-data_%SEQ%.txt as the FileNamingConvention, then all my files would be generated as address-data_1.txt, address-data_2.txt, ...and so on. But where does this sequence number come from? The answer lies in the "control directory" for the particular adapter project (or scenario). In general, for every project that uses the File or Ftp Adapter, a unique directory is created for bookkeeping purposes, and since this control directory is required to be unique, the adapter uses a digest to make sure that no two control directories are the same. For example, for my FlatStructure sample, the control information for my project would go under FMW_HOME/user_projects/domains/soainfra/fileftp/controlFiles/[DIGEST]/outbound, where the value of DIGEST differs from one project to another. If you look under this directory, you will see a file control_ob.properties, and this is where the sequence number is maintained. Please note that the sequence number is maintained in binary form, so you might need a hex editor to view its content. You will also see another zero-byte file, SEQ_nnn, but ignore that for now; we'll get to it some other time. For now, please remember that this extra file is maintained as a backup. One of the challenges faced by the adapter runtime is to guard all writes to the control files so that no two threads inadvertently try to update them at the same time. It does so with the help of a "mutex". For now, please remember that the mutex comes in different flavors: in-memory, DB-based, Coherence-based, and user-defined. Again, we will talk about these mutexes some other time. Please note that there might be scenarios, particularly under heavy load, where the mutex becomes a bottleneck. The adapter, however, allows you to change the configuration so that the adapter sequence value comes from a database sequence or a stored procedure, and in such situations the mutex is actually bypassed, thereby resulting in better throughput. In later releases, the adapter's behavior will default to using a db-sequence. The simplest way to achieve this is by switching the JNDI for the outbound JCA file to use "eis/HAFileAdapter", as shown in the original post's screenshot (not reproduced in this excerpt). But what does this do? Internally, the adapter runtime creates a sequence on the Oracle database. For example, if you do a "select * from user_sequences" in your soainfra schema, you will see a new sequence being created with a name of the form SEQ_<GUID>__, where the GUID differs from one project to another. However, if you want to use your own sequence, you will need to add a new property to your JCA file called SequenceName, as shown below. Please note that you will need to create this sequence in your soainfra schema beforehand. But what if we use DB2 or MSSQL Server as the dehydration store? DB2 supports sequences natively but MSSQL Server does not, so the adapter runtime uses a natively generated sequence for DB2, while for MSSQL Server the adapter relies on a stored procedure that ships with the product. If you wish to achieve the same result for SOA Suite running DB2 as the dehydration store, simply change your connection factory JNDI name in the JCA file to eis/HAFileAdapterDB2, and for MSSQL, use eis/HAFileAdapterMSSQL. And if you wish to use a stored procedure other than the one that ships with the product, you will need to rely on binding properties to override the adapter behavior.
    In particular, you will need to instruct the adapter that you wish to use a stored procedure (again, the original post's screenshot is not reproduced here). Please note that if you're using the File/Ftp Adapter in Append mode, the adapter runtime degrades the mutex to use pessimistic locks, as we don't want writers from different nodes appending to the same file at the same time.
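
    The screenshots this post refers to are not included in this excerpt. Purely as a hedged sketch of the two changes described, the outbound .jca file would end up looking roughly like this (adapter name, directory and sequence name are made up; element layout and class name are from memory, so verify against a generated JCA file):

        <adapter-config name="WriteAddressData" adapter="File Adapter"
                        xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
          <connection-factory location="eis/HAFileAdapter"/>
          <endpoint-interaction portType="Write_ptt" operation="Write">
            <interaction-spec className="oracle.tip.adapter.file.outbound.FileInteractionSpec">
              <property name="PhysicalDirectory" value="/tmp/out"/>
              <property name="FileNamingConvention" value="address-data_%SEQ%.txt"/>
              <!-- use a pre-created DB sequence instead of the adapter-maintained counter -->
              <property name="SequenceName" value="ADDRESS_DATA_SEQ"/>
            </interaction-spec>
          </endpoint-interaction>
        </adapter-config>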

    Read the article

  • Using a :default for file names on include templates in SMARTY 3 [closed]

    - by Yohan Leafheart
    Hello everyone, although I don't think the question was as good as it could be, let me try to explain better here. I have a site using Smarty 3 as the template system, with a template structure similar to the one below:

    /templates/place1/inner_a.tpl
    /templates/place1/inner_b.tpl
    /templates/place2/inner_b.tpl
    /templates/place2/inner_c.tpl
    /templates/default/inner_a.tpl
    /templates/default/inner_b.tpl
    /templates/default/inner_c.tpl

    These are included in the parent template using {include file="{$temp_folder}/{$inner_template}"}. So far so good. What I want is a default: in case the file "{$temp_folder}/{$inner_template}" does not exist, it should use the equivalent file at "default/{$inner_template}". I.e., if I do {include file="place1/inner_c.tpl"} and that file does not exist, it should in fact include "default/inner_c.tpl". Is that possible?
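
    One way to get that fallback is to resolve the folder in PHP before the include, using Smarty 3's templateExists(); a sketch, assuming $smarty, $temp_folder and $inner_template are the same variables the question already uses:

        <?php
        // fall back to the default folder when the place-specific template is missing
        if (!$smarty->templateExists("$temp_folder/$inner_template")) {
            $temp_folder = 'default';
        }
        $smarty->assign('temp_folder', $temp_folder);
        $smarty->assign('inner_template', $inner_template);
        // the parent template keeps using {include file="{$temp_folder}/{$inner_template}"}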

    Read the article

  • How to attach WAR file in email from jenkins

    - by birdy
    We have a case where a developer needs to access the last successfully built WAR file from jenkins. However, they can't access the jenkins server. I'd like to configure jenkins such that on every successful build, jenkins sends the WAR file to this user. I've installed the ext-email plugin and it seems to be working fine. Emails are being received along with the build.log. However, the WAR file isn't being received. The WAR file lives on this path in the server: /var/lib/jenkins/workspace/Ourproject/dist/our.war So I configured it under Post build actions like this: The problem is that emails are sent but the WAR file isn't being attached. Do I need to do something else?
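
    If I remember the email-ext plugin correctly, its Attachments field expects Ant-style patterns relative to the job's workspace rather than absolute filesystem paths, so /var/lib/jenkins/workspace/Ourproject/dist/our.war would silently match nothing. Something like the following in that field should pick the artifact up (the pattern is an assumption based on the path in the question):

        dist/our.war

    or, more loosely, **/dist/*.war. Attachments also count against the mail server's message size limits, so a large WAR may still be dropped even when the pattern matches.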

    Read the article

  • Removal of libsound2 file causes graphics loss

    - by Sajid Ahmad
    I was trying to install Skype on an Ubuntu 12.10 desktop, but it was giving an error related to the libsound2:i386 file. To get around this I removed libsound2, thinking I would install it again later, but that removed all the graphics from my system. After the removal, the system started giving an error that it is running in low-graphics mode. I tried to install the libsound2 file again but couldn't. I then upgraded my Ubuntu release using the do-release-upgrade command, thinking it would install the missing file, but there are still no graphics on the system. I am using a Dell Inspiron 15. Please help me figure out how to get the system's graphics back.
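
    A hedged guess at the recovery path: the library in question is presumably libasound2 (the ALSA library; there is no package called libsound2), and removing it typically drags the whole desktop stack out with it. From a console or recovery-mode shell with working networking, something like the following should pull the desktop back in (package names assume a stock Ubuntu install):

        sudo apt-get update
        sudo apt-get install libasound2 ubuntu-desktop    # ALSA library plus the desktop metapackage
        # the 32-bit variant Skype was actually asking for on a 64-bit install:
        sudo dpkg --add-architecture i386
        sudo apt-get update
        sudo apt-get install libasound2:i386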

    Read the article

  • SSRS2008R2 report times out, but the underlying query executes in the Management Studio

    - by Matthew Belk
    A customer of mine recently moved servers and the new server has SQL2008R2. His old server was SQL2005. The new server has substantially better CPU, RAM, and disk performance than the old, but several reports time out while executing. When I run the underlying query in the SQL Management Studio, the query executes in sub-second time. The exact error message returned via the Report Manager UI is: An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError) Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. It must be noted that this database is not just analytical; it's also fairly transactional, although the transaction volume is not exceptionally high. What can I do to improve the performance of the SSRS query engine? Are there settings in the data source I can adjust, or in the SSRS config files?
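
    The error text names the report server database (rsReportServerDatabaseError), i.e. the SSRS catalog, rather than the report's own data source, so one setting worth checking - a guess, not a confirmed fix - is the catalog query timeout in rsreportserver.config on the report server; the per-report execution timeout is a separate knob under Site Settings and per-report processing options.

        <!-- rsreportserver.config, value in seconds (default is 120) -->
        <Add Key="DatabaseQueryTimeout" Value="300"/>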

    Read the article

  • How to Search File Contents in Windows Server 2008 R2

    - by ybbest
    By default, Windows Search only searches by file name. To configure it to search file contents as well, you need to do the following: first, make sure the Windows Search Service feature is activated (check this article for details). Then configure Windows Search by opening Windows Explorer, pressing the Alt key, and going to Tools -> Folder Options -> Search tab; there, select "Always search file names and contents (this might take several minutes)" and press OK. Now your searches will work on file content like the good old days of XP. Another way to search file contents, without changing the search configuration, is to type "contents:" in the Windows Explorer search box followed by the word you are looking for (e.g. contents:invoice); this searches text files. It is a search filter which seems to be undocumented.

    Read the article

  • Default file permissions for php user www-data

    - by John Isaacks
    I have PHP installed on my Ubuntu machine. The web root is /var/www. I set the permissions for this folder like so: sudo chown -R ftpuser:www-data /var/www. ftpuser is the user I set up so I can FTP to /var/www from another machine on the network, and www-data is the user PHP runs as (I double-checked using whoami from PHP). Whenever I FTP-upload a new file to the machine, the group has no permissions on the file. So when I try to access it in my browser via machine-name/new-file.php, I am told permission denied and I have to go and chmod the new file. I am wondering if there is a way to make the www-data user/group have access permissions to new files by default, so I don't have to keep running chmod on every new file?
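
    Two common approaches, sketched under the assumption that the FTP daemon is vsftpd (adjust for whatever is actually installed): relax the FTP server's umask so uploads are created group/world readable, or set a default ACL so anything new under /var/www is readable by www-data no matter how it gets there (the latter needs the acl package and ACL support on the filesystem).

        # option 1: vsftpd -- in /etc/vsftpd.conf, make new uploads 644/755
        #   local_umask=022
        # then: sudo service vsftpd restart

        # option 2: default ACLs on the web root
        sudo setfacl -R -m g:www-data:rX /var/www       # fix files that already exist
        sudo setfacl -R -d -m g:www-data:rX /var/www    # apply to files created from now on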

    Read the article

  • Microsoft SQL Server Management Studio causing system freeze

    - by CRoshanLG
    I'm experiencing very slow response from MSSMS and it causes other applications to slow down. In particular, Skype crashes a few seconds after MSSMS is opened, showing an error called "Disk I/O Error". I regularly use a few applications (Sublime Text, MS Word, Firefox, Outlook, Skype and one or two others) simultaneously. The system works fine when MSSMS is not in use, but as soon as MSSMS is opened, all the apps start to freeze (MSSMS also responds very slowly). This problem has been there for about a week now (I haven't installed any apps or made any changes to the system during that time). System specifications: Processor: Core i3 (3.1 GHz); RAM: 4 GB; OS: Windows 7 Professional (64 bit); free space in C drive: ~100 GB; MS SQL Server 2008 R2; Microsoft SQL Server Management Studio version 10.50.1600.1. I've tried to find a reason for this but there is no helpful information on the web! There are some solutions suggested (in forums and on Skype support pages) for Skype's "Disk I/O Error", all of which I tried, but they do not solve the problem. Has anyone faced the same scenario? (and hopefully) knows a solution? System log: I don't have much knowledge in interpreting the system log, but I think the Critical and Warning entries are not helpful. There are lots of Error entries, though, which might be useful. From the source Kernel-General there are a few similar errors saying "An I/O operation initiated by the Registry failed unrecoverably. The Registry could not flush hive (file): <some file>". From the source atapi there are also a few similar errors: "The driver detected a controller error on \Device\Ide\IdePort0." (all of these errors occurred on 'IdePort0'). In Application Error, there are several errors logged, and the following is the latest one. Both of the errors which occurred today are similar to this one. As it is from Ssms.exe, I guess it is relevant to the cause of the problem. But as I said above, I can't understand what it means!

    Read the article

  • knife on Windows inconsistently reads ~\.ssh\knife.rb on Management Workstation

    - by gWaldo
    I am implementing a new instance of (open-source v10.12) Chef in an existing environment. Currently the environment is mostly Windows, but more Linux is being introduced. I have used Chef in a previous gig, however that was a *nix-only environment. Because this is a primarily-Windows environment, my main workstation is Windows 7 (x64), and I use PowerShell as my main terminal. I created a ~\.chef directory, populated with a knife.rb and my client.pem file. When I run knife client list from ~, I get the expected results. I keep my work in Dropbox just in case my laptop should fail or be stolen. When I run knife client list from the repo directory (C:\Users\waldo\Dropbox\_company\projects\chef), I get:

    ERROR: Your private key could not be loaded from C:/home/waldo/.chef/waldog.pem
    Check your configuration file and ensure that your private key is readable

    (Note that the path is incorrect.) This is the progression as I walk up the tree towards ~, running knife client list:

    C:\Users\waldo\Dropbox\_company\projects\ => Above error
    C:\Users\waldo\Dropbox\_company\ => Above error
    C:\Users\waldo\Dropbox\ => It works! (Expected results)
    C:\Users\waldo\ => Expected results
    C:\Users\waldo\Documents\ => Expected results
    C:\Users\waldo\Documents\GitHub => Expected results
    C:\Users\waldo\Documents\GitHub\aProject\ => Expected results

    What. The. Eff! Now, I know that I can add -c path\to\knife.rb, but that's a HUGE PITA. The question is: why is knife inconsistently reading my ~\.chef\knife.rb, and how can I get around that without incurring carpal tunnel?
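
    As far as I know, knife looks for a .chef directory in the current working directory and each of its parents before falling back to ~\.chef, so the pattern above smells like a stray .chef\knife.rb (with the old C:/home/waldo path baked in) sitting somewhere under Dropbox\_company or Dropbox\_company\projects. A quick way to hunt for it from PowerShell (the path is the one from the question):

        # list every knife.rb under the Dropbox tree
        Get-ChildItem 'C:\Users\waldo\Dropbox' -Recurse -Force -Filter knife.rb |
            Select-Object FullName

    Deleting or correcting the stray copy should make knife fall back to ~\.chef\knife.rb everywhere.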

    Read the article

  • Increase the size of a memory mapped file

    - by sandun dhammika
    I am maintaining a memory-mapped file to store my tree-like data structure, and I ran into this problem when updating the data structure. The file is limited in size and can't be too large or too small. I have methods like:

    void mapfile_insert_record(RECORD* /* record */);
    void mapfile_modify_record(RECORD* /* record */);

    Both operations could need more space than is currently free in the mapped file. How do I overcome this? What strategy should I use?

    1. Calculate whether the file needs to grow, as a precondition in both methods.
    2. Grow it dynamically, for example by managing a timer and constantly polling the file for its available free space, then automatically extending it.

    Any ideas or patterns to overcome this problem?
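
    A minimal sketch of the grow-on-demand approach in C on Linux (the RECORD type and mapfile_* names are the question's; the rest is illustrative): extend the backing file with ftruncate(), then enlarge the mapping with mremap(), or munmap() plus mmap() on platforms without mremap(). The insert/modify methods would call this as a precondition whenever their space check fails, which is essentially option 1; the timer/polling variant still has to handle the race where a record arrives before the next poll.

        #define _GNU_SOURCE            /* for mremap() */
        #include <sys/mman.h>
        #include <sys/types.h>
        #include <unistd.h>
        #include <stddef.h>

        /* Returns the (possibly moved) mapping address, or NULL on failure.
           Callers must not keep raw pointers into the old mapping (MREMAP_MAYMOVE). */
        void *mapfile_grow(int fd, void *old_addr, size_t old_size, size_t new_size)
        {
            if (ftruncate(fd, (off_t)new_size) != 0)   /* grow the file on disk first */
                return NULL;
            void *p = mremap(old_addr, old_size, new_size, MREMAP_MAYMOVE);
            return (p == MAP_FAILED) ? NULL : p;
        }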

    Read the article

  • Change Windows Authentication user for Sql Server Management Studio

    - by Asmor
    We're using SQL Server 2005 with Windows Authentication set up. So normally, when you log in using e.g. SQL Server Management Studio, it forces you to log in as MACHINE_NAME\Username. Anyways, on this one particular computer, the person said they had to make a new account called User01 to do something, and showed me where she'd created it under Security in the "master" system database. And so now when she logs in, it's listed as MACHINE_NAME\User01 (not the actual Windows user name). It's still set to Windows Authentication, though, and I'm unable to change the login name. Now here's where the real problem comes in... I didn't realize that she was being logged in under this user name at the time, and I disabled it to see what would happen. Now I can't log into the server under her account. I created a new account in Windows called test, and as expected SSMS had the username as MACHINE_NAME\test, and I was able to log in fine. However, the area where the User01 account was listed is not visible to me as far as I can tell, so I can't re-enable it. I also tried running the following query: alter login User01 ENABLE. And got this error: Msg 15151, Level 16, State 1, Line 1 Cannot alter the login 'User01', because it does not exist or you do not have permission. So in a nutshell, ideally I'd like to re-enable User01 somehow, just to get things back to where they used to be. Failing that, how can I force SSMS to log in using the Windows account name as it should be, rather than trying to use User01?
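
    A sketch of re-enabling it from a connection that does have sysadmin rights; the error usually means either that the current connection lacks permission to alter logins or that the login isn't stored under exactly the name 'User01' (it may well be MACHINE_NAME\User01):

        -- see how the login is actually named and whether it is disabled
        SELECT name, type_desc, is_disabled
        FROM sys.server_principals
        WHERE name LIKE '%User01%';

        -- then, connected as sa or another sysadmin, using the exact name returned above:
        ALTER LOGIN [MACHINE_NAME\User01] ENABLE;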

    Read the article

  • How to create a deb package that installs a series of files

    - by fossfreedom
    I would like to create a brand new deb package to install a series of files. If at all possible, I would like to untar the folder containing these files, as part of the installation, into a known folder location. Failing that, some knowledge of how to package the source folders and files would be very useful. The question is: is this possible, and if so, how? Let's give an example: ~/mypluginfolder/ contains the files x, y, a subfolder called abc, and inside that another file called z. I want to tar this folder: tar -cvf myfiles.tar ~/mypluginfolder. I presume my Debian package would look like:

    myfiles.tar.gz
    myfiles+ppafoss_0.1-1/
        myfiles.tar
        DEBIAN
            changelog, compat, control, install, rules
            source

    Is it possible to somehow untar myfiles.tar to a known folder location, for example /usr/share/rhythmbox/plugins/? Thus the final result would be:

    /usr/share/rhythmbox/plugins/mypluginfolder
    /usr/share/rhythmbox/plugins/mypluginfolder/x
    /usr/share/rhythmbox/plugins/mypluginfolder/y
    /usr/share/rhythmbox/plugins/mypluginfolder/abc/z

    Presuming Launchpad needs source, advice is also sought as to where I should drop the source folders and files within the deb package structure. This will eventually become a series of individual Launchpad PPA packages. What I prefer (but may not be able to achieve...) is to keep my packaging to a minimum: create a series of packages from a template and adjust the bare minimum (changelog etc. plus the tar file / file & folder structure).
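
    For a plain binary .deb (no PPA), one sketch is to lay the files out in a staging tree exactly as they should land on disk, add a DEBIAN/control file, and build with dpkg-deb; for a source package that Launchpad can build, the usual equivalent is a debian/install file telling dh_install where the payload goes. Names and the version number below are taken from the question's example.

        # staging layout for a binary package
        mypluginfolder-deb/
            DEBIAN/control
            usr/share/rhythmbox/plugins/mypluginfolder/x
            usr/share/rhythmbox/plugins/mypluginfolder/y
            usr/share/rhythmbox/plugins/mypluginfolder/abc/z

        dpkg-deb --build mypluginfolder-deb mypluginfolder_0.1-1_all.deb

        # source-package route (Launchpad PPA): ship the files unpacked in the source tree
        # and add a one-line debian/install file mapping them to their destination:
        #   mypluginfolder/* usr/share/rhythmbox/plugins/mypluginfolder/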

    Read the article

  • How can I add the version of a file to the file name with Tortoise-SVN?

    - by Eric Belair
    I would like to start giving unique names to "cache-able" files - i.e. *.css and *.js - in order to prevent caching, without requiring changes to the web-server settings (as is currently done in IIS). For instance, let's say I have a JavaScript file called global.js. Going forward I would like it to have the name global.123.js when revision 123 is checked in. This would also require the following: the previous version of the file - perhaps it was global.115.js - is removed when the file is deployed, and all references to the file are updated with the new file name. How do I go about doing this? What concerns do I need to consider?
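
    TortoiseSVN itself will not rename files at check-in, so this is normally done as a build or deploy step. A rough Windows batch sketch (file names from the question; it assumes an up-to-date, single-revision working copy, and rewriting the references is left to the build tool):

        @echo off
        rem svnversion ships with the Subversion command-line tools (an optional part of the TortoiseSVN installer)
        for /f %%r in ('svnversion -n .') do set REV=%%r
        copy global.js global.%REV%.js
        rem removing the previously deployed global.<oldrev>.js and updating references
        rem to global.%REV%.js is handled by the deploy script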

    Read the article

  • Replacing LF, NEL line endings in text file with CR+LF

    - by Tomas Lycken
    I have a text file with a strange character encoding that I'd like to convert to standard UTF-8. I have managed to get part of the way:

    $ file myfile.txt
    myfile.txt: Non-ISO extended-ASCII text, with LF, NEL line endings
    $ iconv -f ascii -t utf-8 myfile.txt > myfile.txt.utf8
    $ file myfile.txt.utf8
    myfile.txt.utf8: UTF-8 Unicode text, with LF, NEL line endings
    ## edit myfile.txt.utf8 using nano, to fix failed character conversions (mostly åäö)
    $ file myfile.txt.utf8
    myfile.txt.utf8: UTF-8 Unicode text, with LF, NEL line endings

    However, I can't figure out how to convert the line endings. How do I replace LF+NEL with CR+LF (or whatever is the standard)? When I'm done, I'd like to see the following:

    $ file myfile.txt
    myfile.txt: UTF-8 Unicode text
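
    NEL is U+0085, which iconv leaves alone because it is a perfectly valid character. A sketch of one way to finish the job: turn every NEL into an ordinary LF first, then let unix2dos (from the dos2unix package) convert all the LFs to CR+LF:

        perl -CSD -pe 's/\x{0085}/\n/g' myfile.txt.utf8 > myfile.crlf.txt
        unix2dos myfile.crlf.txt     # rewrites the file in place with CR+LF line endings
        file myfile.crlf.txt         # should now report "UTF-8 Unicode text, with CRLF line terminators"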

    Read the article

  • SharePoint, Exchange and Incoming Emails Without Directory Management Services

    - by Nariman
    Trying to keep this as simple as possible. We've already created the email accounts that we need (e.g. account[1-20]@domain.com) on Exchange/AD. We'd now like to enable incoming email on the SharePoint 2007 lists corresponding to these accounts. My thinking is we don't need to configure Directory Management Services [2] - the architecture will be simpler without it and the application doesn't require these services. However, we still need to route messages from Exchange either to the local SMTP service (via the connector described in the articles below) or via user-specific drop-folder settings (if permitted by Exchange). So the question is: can we instruct Exchange to use a drop folder just for the accounts account[1-20]@domain.com? Or do we need to change the accounts to account[1-20]@sharepointsmtp.domain.com and re-route those messages to the local SMTP service that will drop them on disk? I've read the material below.

    [1] - http://www.combined-knowledge.com/Downloads/2007/How%20to%20configure%20Email%20Enabled%20Lists%20in%20Moss2007%20RTM%20using%20Exchange%202007.pdf
    http://social.msdn.microsoft.com/Forums/en/sharepointdevelopment/thread/91e0c3d2-afe6-469d-b1bc-6ae7a9aa287e
    http://gj80blogtech.blogspot.com/2009/12/configure-incoming-email-setting-in.html
    http://www.jasonslater.co.uk/2007/08/10/configuring-incoming-mail-on-moss-2007-and-exchange-2007/
    http://technet.microsoft.com/en-us/library/cc262947%28office.12%29.aspx
    http://technet.microsoft.com/en-us/library/cc263260%28office.12%29.aspx
    [2] - http://graycloud.com/sharepoint/incoming-mail-configuration-what-permissions-are-require-t39483.html
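
    If the re-routing option wins, a sketch of the Exchange 2007 side (connector name, subdomain and IP are placeholders; this mirrors what the linked walkthroughs describe rather than anything verified here): a send connector that hands everything for the SharePoint subdomain to the SMTP service on the SharePoint server. As far as I know, Exchange's pickup/drop directories are per server rather than per recipient, which is why the walkthroughs fall back to the subdomain-plus-local-SMTP-service arrangement instead of a drop folder for specific mailboxes.

        New-SendConnector -Name "SharePoint incoming e-mail" -Usage Custom `
            -AddressSpaces "sharepointsmtp.domain.com" `
            -SmartHosts 192.168.1.50 -DNSRoutingEnabled $false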

    Read the article

  • Framework Folders and Duplicate File Names

    - by Kevin Smith
    I have been working with Framework folders a little bit in the past few days and found one unexpected behavior that is different from Contribution Folders (Folders_g). If you try and check a file into a Framework Folder that already exists in the folder it will allow it and rename the file for you. In Folders_g this would have generated an error and prevented you from checking in the file. A quick check of the Framework Folder configuration settings in the Application Administrator’s Guide for Content Server does not show a configuration parameter to control this. I'm still thinking about this and not sure if I like this new behavior or not. I guess from a user perspective this more closely aligns Framework Folders to how Windows handle duplicate file names, but if you are migrating from Folders_g and expect a duplicate file name to be rejected, this might cause you some problems.

    Read the article
