Search Results

Search found 5490 results on 220 pages for 'shadow folders'.

Page 10/220

  • samplerCubeShadow and texture offset

    - by Irbis
    I use sampler2DShadow when accessing a single shadow map. I create the PCF in this way:

        result += textureProjOffset(ShadowSampler, ShadowCoord, ivec2(-1, -1));
        result += textureProjOffset(ShadowSampler, ShadowCoord, ivec2(-1,  1));
        result += textureProjOffset(ShadowSampler, ShadowCoord, ivec2( 1,  1));
        result += textureProjOffset(ShadowSampler, ShadowCoord, ivec2( 1, -1));
        result = result * 0.25;

    For a cube map I use samplerCubeShadow:

        result = texture(ShadowCubeSampler, vec4(normalize(position), depth));

    How can I adapt the above PCF when accessing a cube map?
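
    Since textureProjOffset has no samplerCubeShadow overload, one commonly suggested workaround is to build two tangent vectors perpendicular to the sampling direction and jitter the direction itself. A minimal sketch under that assumption (the function name pcfCube and the radius parameter are illustrative, not part of the question):

        // Hypothetical cube-map PCF: offset the direction vector rather than
        // texels, because GLSL has no texel-offset lookup for cube maps.
        float pcfCube(samplerCubeShadow shadowCube, vec3 position, float depth, float radius)
        {
            vec3 d = normalize(position);
            // Pick a helper axis that is not parallel to d, so the cross
            // product below cannot degenerate.
            vec3 up = (abs(d.y) < 0.99) ? vec3(0.0, 1.0, 0.0) : vec3(1.0, 0.0, 0.0);
            vec3 t = normalize(cross(up, d)); // first tangent
            vec3 b = cross(d, t);             // second tangent, already unit length

            float result = 0.0;
            result += texture(shadowCube, vec4(normalize(d + radius * (-t - b)), depth));
            result += texture(shadowCube, vec4(normalize(d + radius * (-t + b)), depth));
            result += texture(shadowCube, vec4(normalize(d + radius * ( t + b)), depth));
            result += texture(shadowCube, vec4(normalize(d + radius * ( t - b)), depth));
            return result * 0.25;
        }

    A radius on the order of one over the cube face resolution roughly mimics the one-texel offsets of the 2D version.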

  • Shadowmap first phase and shaders

    - by KaiserJohaan
    I am using OpenGL 3.3 and am trying to implement shadow mapping using cube maps. I have a framebuffer with a depth attachment and a cube map texture. My question is how to design the shaders for the first pass, when creating the shadowmap. This is my vertex shader:

        in vec3 position;
        uniform mat4 lightWVP;

        void main()
        {
            gl_Position = lightWVP * vec4(position, 1.0);
        }

    Now, do I even need a fragment shader in this shader pass? From what I understand after reading http://www.opengl.org/wiki/Fragment_Shader, by default gl_FragCoord.z is written to the currently attached depth component (to which my cubemap texture is bound). Thus I shouldn't even need a fragment shader for this pass, and from what I understand there is no other work to do in the fragment shader other than writing this value. Is this correct?
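
    For what it's worth, a depth-only pass can indeed run with an empty fragment shader, or with no fragment shader bound at all, since the pipeline writes gl_FragCoord.z to the attached depth buffer on its own. A minimal sketch under that assumption (GLSL 3.30, to match the vertex shader above):

        #version 330

        // Depth-only pass: no color outputs are declared. Depth is taken from
        // gl_FragCoord.z automatically; nothing needs to be written here
        // unless gl_FragDepth must be overridden.
        void main()
        {
        }

    Note that some cube-map shadow implementations instead write the light-to-fragment distance to gl_FragDepth (or to a color target), in which case a fragment shader is required; that is a design choice, not an API requirement.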

  • Point Light Soft Shadows

    - by notabene
    How does one implement soft shadows for an omnidirectional (point) light? We use the typical shadow mapping technique. Depth is rendered to a texture cube, and addressing is then pretty simple: just use the vector from the light to the fragment's world position. It works perfectly. Until you want soft shadows. In our engine we use the PCSS technique for spot lights. But for point lights the trouble begins. How do you sample in 3D? I developed a technique in which an orthonormal basis is created from a direction and an up vector (0,1,0), and the sampling vector (something like (1.0, i/depthMapSize, j/depthMapSize)) is then multiplied by this basis. But this (of course :)) looks pretty bad for vectors near (0,1,0) and (0,-1,0). I will appreciate any help on this.
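
    One way to remove those poles is to derive the basis from the direction alone, without any fixed up vector. A sketch of the branchless construction from Duff et al., "Building an Orthonormal Basis, Revisited" (2017), assuming n is the normalized light-to-fragment direction (the function name buildBasis is illustrative):

        // Branchless orthonormal basis: well defined for every unit vector n,
        // including n near (0, 1, 0) and (0, -1, 0).
        void buildBasis(vec3 n, out vec3 t, out vec3 b)
        {
            float s = (n.z >= 0.0) ? 1.0 : -1.0; // copysign(1.0, n.z)
            float a = -1.0 / (s + n.z);
            float c = n.x * n.y * a;
            t = vec3(1.0 + s * n.x * n.x * a, s * c, -s * n.x);
            b = vec3(c, s + n.y * n.y * a, -n.y);
        }

    Each PCSS sample direction then becomes normalize(n + (i * t + j * b) / depthMapSize), which stays well behaved over the whole sphere.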

  • How to access previous VHD versions of system backup?

    - by feklee
    Quote from the 31 Oct 2009 TechNet article "Learn more about system image backup":

        During the first backup, the backup engine scans the source drive and copies only blocks that contain data into a .vhd file stored on the target, creating a compact view of the source drive. The next time a system image is created, only new and changed data is written to the .vhd file, and old data on the same block is moved out of the VHD and into the shadow copy storage area. Volume Shadow Copy Service is used to compute the changed data between backups, as well as to handle the process of moving the old data out to the shadow copy area on the target. This approach makes the backup fast (since only changed blocks are backed up) and efficient (since data is stored in a compact manner). When restoring the image, blocks will be restored to their original locations on the source disk. If you want to restore from an older backup, the engine reads from the shadow copy area and restores the appropriate blocks.

    For the last few days, a daily system backup of drive C: to drive E: has been scheduled and run by Windows 7 Backup and Restore. Drive C: currently holds 233 GB of data, which fits comfortably on drive E:, a 1 TB drive, with 727 GB of free space remaining. How do I access the previous version of a VHD? I right-clicked on files and folders in E:\WindowsImageBackup and looked for Previous Versions, but always get: "There are no previous versions available".

  • Hyper-V vss-writer not making current copies [migrated]

    - by Martinnj
    I'm using diskshadow to back up live Hyper-V machines on a Windows 2008 server. The backup consists of 3 scripts: the first creates the shadow copies and exposes them, the second uses robocopy to copy them to a remote location, and the third unexposes the shadow copies again. The first script is the one that runs correctly but fails to do what it's supposed to:

        # DiskShadow script file to backup VM from a Hyper-V host
        # First, delete any shadow copies of the drives. System drives need to be included.
        Delete Shadows volume C:
        Delete Shadows volume D:
        Delete Shadows volume E:

        # Ensure that shadow copies will persist after DiskShadow has run
        set context persistent

        # make sure the path already exists
        set verbose on

        begin backup
        add volume D: alias VirtualDisk
        add volume C: alias SystemDrive

        # verify the "Microsoft Hyper-V VSS Writer" writer will be included in the snapshot
        # NOTE: The writer GUID is exclusive for this install/machine, must be changed on other machines!
        writer verify {66841cd4-6ded-4f4b-8f17-fd23f8ddc3de}

        create
        end backup

        # Backup is exposed as drive X: make sure your drive letter X is not in use
        Expose %VirtualDisk% X:
        Exit

    The next is just a robocopy and then an unexpose. Now, when I run the above script, I get no errors from it, except that the "BITS" writer has been excluded because none of its components are included. That's okay, because I really only need the Hyper-V writer. Also, I double-checked the GUID for the writer; it's correct. During the time when the Hyper-V writer becomes active, 2 things will happen on the guest machines: the Debian/Linux machine will go to a saved state and restore when done, all fine; the Windows guests will be "creating VSS snapshot sets" or something similar. Then X: gets exposed and I can copy the .vhd files over. The problem is that, for some reason, the VHD files I get seem to be old copies; they are missing files, users and updates that exist on the actual machines. I also tried putting the machines in a saved state manually; it didn't change the outcome. I hope someone here has an idea of how to solve this.

  • WebCenter Content shared folders for clustering

    - by Kyle Hatlestad
    When configuring a WebCenter Content (WCC) cluster, one of the things which makes it unique among WebLogic Server applications is its requirement for a shared file system. This is actually no different than 10g and previous versions of UCM, when it ran directly on a JVM. And while it is simple enough to say it needs a shared file system, there are some crucial details in how those directories are configured, and if they aren't followed, you may run into some unwanted behavior. This blog post will go into the details of how exactly the file systems should be split and what options are required.

    Beyond documents being stored on the file system and/or database, and metadata being stored in the database along with other structured data, other information is read from and written to the file system. Information such as user profile preferences, workflow item state information, metadata profiles, and other details is stored in files. In addition, for certain processes within WCC, each of the nodes needs to know what the other nodes are doing so they don't step on each other. WCC keeps track of this through the use of lock files on the file system. Because of this, each node of the WCC cluster must have access to the same file system, just as it has access to the same database.

    WCC implements its own locking mechanism using files, so it also needs access to those files without file attribute caching and without locking being done by the client (node). If one of the nodes accesses a certain status file and it happens to be cached, that node might attempt to run a process which another node is already working on. Or if a particular file is locked by one of the node clients, this could interfere with access by another node. Unfortunately, disabling file attribute caching on the file share can impact performance, so it is important to disable caching and locking only on the particular folders which require it.

    When configuring WebCenter Content after deploying the domain, it asks for 3 different directories: Content Server Instance Folder, Native File Repository Location, and Weblayout Folder. Starting in PS5, it also asks for the User Profile Folder. Even if you plan on storing the content in the database, you still need to establish the Native File (Vault) and Weblayout directories. These will be used for handling temporary files, cached files, and files used to deliver the UI. Of these directories, the only one which needs file attribute caching and locking disabled is the 'Content Server Instance Folder'. So when establishing this share through NFS or a clustered file system, be sure to specify those options. For instance, if creating the share through NFS, use the 'noac' and 'nolock' mount options. For the other directories, caching and locking should be left enabled to provide the best performance in those locations.
    These directory path configurations are contained within the <domain dir>\ucm\cs\bin\intradoc.cfg file:

        #Server System Properties
        IDC_Id=UCM_server1

        #Server Directory Variables
        IdcHomeDir=/u01/fmw/Oracle_ECM1/ucm/idc/
        FmwDomainConfigDir=/u01/fmw/user_projects/domains/base_domain/config/fmwconfig/
        AppServerJavaHome=/u01/jdk/jdk1.6.0_22/jre/
        AppServerJavaUse64Bit=true
        IntradocDir=/mnt/share_no_cache/base_domain/ucm/cs/
        VaultDir=/mnt/share_with_cache/ucm/cs/vault/
        WeblayoutDir=/mnt/share_with_cache/ucm/cs/weblayout/

        #Server Classpath variables

        #Additional Variables
        #NOTE: UserProfilesDir is only available in PS5 - 11.1.1.6.0
        UserProfilesDir=/mnt/share_with_cache/ucm/cs/data/users/profiles/

    In addition to these folder configurations, it's also recommended to move node-specific folders to local disk to avoid unnecessary traffic to the shared directory. So on each node, go to <domain dir>\ucm\cs\bin\intradoc.cfg and add these additional configuration entries:

        VaultTempDir=<domain dir>/ucm/<cs>/vault/~temp/
        TraceDirectory=<domain dir>/servers/<UCM_serverN>/logs/
        EventDirectory=<domain dir>/servers/<UCM_serverN>/logs/event/

    And of course, don't forget the cluster-specific configuration values to add as well. These can be added through Admin Server -> General Configuration -> Additional Configuration Variables or directly in the <IntradocDir>/config/config.cfg file:

        ArchiverDoLocks=true
        DisableSharedCacheChecking=true
        ServiceAllowRetry=true     (use only with an Oracle RAC database)
        PublishLockTimeout=300000  (time can vary depending on publishing time and number of nodes)

    For additional information and details on clustering configuration, I highly recommend reviewing document [1209496.1] on the support site. In addition, there is a great step-by-step guide on setting up a WebCenter Content cluster [1359930.1].

  • Backup all home folders on usb disk and accessibility

    - by PatrickV
    I am using Ubuntu 12.04 and have multiple family members working on it, each with their own home folder. I have a USB disk and want to use it to back up my home folders. Trying this, I ran into some questions. When the disk auto-mounts, it is not visible to every user; it seems to be visible only to the user who was logged in when I connected the USB disk. I want to create one folder per home on the USB disk to back up the data to. But when I format the disk as ext4 or FAT, for example, it is read-only for the other users. How can I format the disk so that it is accessible to every user? Best Regards, Patrick

  • Upgraded to 11.10, lost personal folders, Ubuntu One shows no files

    - by Kevin
    Upgraded from 10.10 to 11.04; the system would only come up in terminal mode, but it told me that an additional upgrade was available and asked did I want to do that. Foolishly thinking that might fix the problem, I said yes. This time it did not make it all the way through the upgrade; when I came back to the computer over an hour later, the screen was filled with an error message, "could not open display", and I had to reboot. Went to recovery mode on reboot to install the nvidia module; when I rebooted, the system came up fine, but without carrying over my personal folders: I have the home folder, but my personal folder is not in it. Came to Ubuntu One, but it gives an error message: File Sync error. (org.freedesktop.DBus.Error.NoReply: Did not receive a reply. Possible causes include: the remote application did not send a reply, the message bus security policy blocked ...) Is there a way around this in order to restore my files? I know my files existed on Ubuntu One as of a few months ago.

  • Encrypt folders out of Home

    - by igi
    Is there a way to encrypt a folder that is not in /home but on a different partition, so that only my user can access/read the files it contains? Alternatively, I would like to understand whether it is possible to turn a complete ext4 partition into an encrypted volume that would be mounted at user login. If possible, I would like to make the change without reinstalling Ubuntu. My PC has (mount output):

        /dev/sda1 on / type ext4 (rw,errors=remount-ro)
        /dev/sda3 on /home type ext4 (rw)
        /dev/sda4 on /home/igor/Personale type ext4 (rw)

    sda4 is the partition containing the folders I would like to protect. Thanks!

  • Wiped data, and duplicated folders into files.

    - by Kaustubh P
    Something weird happened today, and I don't know how. Within a folder, all folders have a file by the same name, with a colon appended to it. And all the files from the innermost directories in my home have been dumped to ~, with a size of 0 bytes. I have not executed any scripts or anything. I was just checking out some easter eggs, namely "gegls from outer space" and "free the fish", and was away from the computer and got locked out because of the screensaver. I couldn't log back in with my password, so I just reset the PC, and while booting, the PC went into a drive check. BUT, IIRC, I saw the duplicate "folder files" before I had logged out, so that's not the reason! All the files have a timestamp of 14 Jan. Also, the contents of my eclipse folder have been dumped into ~, right down to the jars and ini files. HELP!

  • Be careful when Git suppresses bin Folders

    - by Marko Apfel
    Initial situation: for Visual Studio projects, the typical content of a .gitignore file often contains this line:

        bin

    or

        [Bb]in

    It is used to keep Git from tracking compile output as repository-relevant data.

    Problem: but keep in mind that this will also suppress the bin folders of additional stuff like frameworks and toolsets. For instance, Microsoft.SDKs contains a folder named Bin with a lot of programs, and Simian contains a folder named bin with the programs themselves. If you also store such artifacts in the repository – according to the principle of a "self-containing project" – you could lose the content in the bin folder!

    Solution: as of yet I don't have a good idea. So I verify for each newly added toolset or framework whether or not it has such a bin folder. If it has, then I must add this bin folder manually to the repository so that Git tracks it.

  • Places Folders Open in Archive Manager

    - by PansophySR
    I am relatively new to Ubuntu. Currently running 10.10, which was an upgrade from 10.04, which was an upgrade from 9.10, which was a fresh install. I have never compressed anything in Ubuntu, but because I wanted to use the contents of a large folder on a Windows machine, I installed 7zip. Using Places, I navigated to the folder I wanted to compress, right-clicked, chose Compress, selected 7-Zip and started the compression. This took many minutes to complete (the final 7z file is over 2.2 GB), but when I copied it to the Windows machine, 7-Zip handled it fine. However, now when I open Places, the Home Folder, User folder, Documents, Music, Pictures, Videos and Downloads all open the Archive Manager, which gives an error message that it "Could not create the archive" because the "Archive type not supported." If I open Places/Computer, choose the usr folder from Places on the left and right-click/Properties on any of the folders, Music for instance, there is no place to change the "open with". Does anyone know how to get this working again?

  • Apple Mail clones Gmail account folders and gets out of sync when tracking unread emails

    - by Petruza
    The Gmail (fc.mm.mp.lh is Gmail also) accounts that I've set up with Mail automatically created a second folder for each of the accounts, the ones you can see in ALL CAPS at the bottom. I guess these folders represent the web mail accounts, while the folders inside Inbox represent the POP accounts, despite them being the same account. The thing is, as you can see, while the Inbox accounts have no unread mails, their "all caps" counterparts show as if they had some unread mails. This is not the normal behavior; when I mark an email as read, it is "read" in both versions of the account, but from time to time they kind of get "out of sync" and the bottom folders start to show unread emails that were actually read. Have you seen this behavior before? What can I do? I don't use the bottom "folders", but I can't get rid of them anyway. It's just that their unread-message notification annoys me, because there aren't actually any unread mails.


  • Time Machine doesn't back up some folders/files (that it should)

    - by Eric
    MacBook Pro 17" (Snow Leopard) -- WD 2TB external drive
    MacBook Pro 13" (Snow Leopard) -- Seagate 1TB external drive

    I find that Time Machine sometimes doesn't back up new folders (and the files in them). This occurs both when I choose "Back Up Now" from the Time Machine icon in the Menu Bar and in TM's scheduled backups. These are not excluded folders (nor are they in the TM do-not-back-up list); they're perfectly normal folders (at various locations) inside my home folder. The only way to force them to be backed up is to restart the computer (unmounting & mounting the TM external disk does not help). There seems to be a correlation with new folders (i.e., it's more likely to happen that an entire new folder is not backed up), but this may just be observer bias (because those are the folders that I go check to see if they've been backed up). It's not computer dependent (it happens on two different computers). It's not external disk dependent (it happens on two different external disks). It's not time dependent (not restarting for several days does not fix the problem). What does a restart change that these other events don't? I'm considering deleting the /.fseventsd folder (without restarting the computer) to see if that helps. I haven't tried logging out and logging in (without restarting the computer).

  • Picasa "sync to web" folders losing sync when they are moved

    - by GJ.
    I've been using the Picasa "sync to web" feature but recently noticed that several folders, with a lot of synced photos and videos inside them, lost their synced status as soon as I moved them to another location on the disk (not through the Picasa "move folder" command). These folders now still appear with the green arrow indicating that their contents were uploaded, but they have lost the blue sync icon they previously had (and are no longer syncing...). If I try to reactivate the "sync to web" option for these folders, Picasa starts re-uploading ALL of their contents. This is absurd and would take ages to complete. Is there any way I can somehow get Picasa to recognize these moved folders as the counterparts of existing online folders for sync purposes?

  • Individual pst files for individual outlook 2010 folders

    - by Jack
    I was wondering whether the following scenario is possible: suppose I have 5 folders in my Microsoft Outlook 2010. Currently there is one .pst file that contains the 5 folders. Each folder contains emails grouped according to the folder name. So, can I have 5 .pst files, one .pst file per folder? I ask because at some later time I will create new folders, and I may remove an existing folder from Outlook (while still being able to keep a copy of its .pst file).

  • Switching to open folders on a Mac

    - by Charles
    How do I easily switch to an open folder on a Mac? In Windows, which I'm used to using, I can see all my opened folders in my vertical taskbar; all I need to do to switch to another window is click on the folder in the taskbar. There's no taskbar on a Mac, and when I have a lot of folders opened, i.e. lots of Finder windows, how can I switch between them? The way I'm doing it is, I put Exposé on an active corner and switch that way. However, that's still damn hard, because first I have to bring up Exposé and then find my window. The folders are placed in a random position between opened apps, the folders are not in a list, and on a big screen I have to scan the whole screen in order to find the one I want... etc. Is it really this hard just to switch to a different folder on a Mac? Is there a taskbar solution on a Mac?

  • Explode all folders

    - by sam
    I've got a folder with about a hundred subfolders, and each one of those has between 10 and 20 subfolders of its own, so all in all it's a pretty large folder tree. Is there a simple way I can explode or export all of the files in the tree to a new folder, which would just be one folder containing the files (no folders, no tree)? I'm running OS X 10.8, although I've also got Parallels, so if there is a Windows solution I could just run that, as it's not something I need to do every day.

  • Download folders from dev server to local drive

    - by Niall Collins
    I am developing a .NET web application in a local environment. I have a dev server that the application is installed on. Within the web application on the dev server I have four folders that I don't have locally and that are controlled by another application. In my day-to-day development I require the four folders on my local PC. I would like to automate the process of pulling the folders from the dev server to my local drive, so I can keep things in sync. Ideally something like this: run a file from the main folder (be it a bat file, PowerShell, some sort of job; open to recommendations) which downloads the 4 folders supplied to it; the first download brings everything down, and from then on it pulls only the changes. Not sure where to start with achieving this, but I would appreciate any help. I know there are apps out there that do something like this, but I would like to have a go at writing something to do it before I resort to using one of them.

  • Windows Server 2008 R2 - Giving local administrator full rights to all folders

    - by ToastMan
    Hi guys, Is there a quick way to give the local administrator full rights to all folders on the C drive? I am having a really hard time with that: I try to give it full rights to some folders (user profiles), but in some cases I can't even modify the NTFS permissions, and I get "permission denied". Is there some sort of tutorial or script that will just give the administrator full rights to all folders on the C drive? Many thanks for your help! Toast

  • Users removing Administrator from files/folders permissions

    - by user64204
    We're running Windows Server 2003 R2 with Active Directory and are having an issue with network shares whereby users, in an attempt to secure their documents, remove everybody (including the Administrator account) from their files/folders permissions. Since the Administrator no longer has read permission on them, we can't even back up files manually, as we get permission errors. One solution that we've found is to change the owner of the files and directories to the Administrator account. We can then change the permissions as we wish. The problem is that this has to be done manually, so it can't really be applied to an entire share. Another solution that we've tried is to use cacls as follows:

        cacls d:\path\to\share /C /T /E /G Administrator:F

    The problem with this is that we're still getting an ACCESS DENIED error on files/folders from which Administrator was removed. Q1: Is there a way to restore at least read access to all files/folders to the Administrator account in a recursive fashion? That would be for the short term. For the long term we're looking for a solution to prevent users from removing Administrator from files/folders permissions. Since we're going to migrate to Windows Server 2008 R2 soon, we could wait until we've migrated to implement such a solution if need be. Q2: Is there a way to prevent users from removing Administrator from files/folders permissions on Windows Server 2003/2008?

  • Windows 7 Sub-Folders hidden in "Program Files" directory

    - by ron tornambe
    I have Google-searched for an hour now and I am confounded. I am using InnoSetup to install a .NET WinForms application that creates directories and folders on the fly. (I have set the folder options to display hidden files and folders.) Although the files that are added to the "created" folders appear within the application, they do not show up when using Windows Explorer or even when issuing a dir from a command prompt. I have also modified the application to display (and delete) the contents of these (seemingly imaginary) folders, so I am sure they exist. What am I missing?
