Search Results

Search found 11822 results on 473 pages for 'external assembly'.

Page 135/473 | < Previous Page | 131 132 133 134 135 136 137 138 139 140 141 142  | Next Page >

  • Preventing 'Reply-All' to Exchange Distribution Groups

    - by Larold
    This is another question in a short series regarding a challenging Exchange project my co-workers have been asked to implement. (I'm helping even though I'm primarily a Unix guy, because I volunteered to learn PowerShell and implement as much of the project in code as I could.) Background: We have been asked to create many distribution groups, say 500+. These groups will contain two types of members. (Apologies if I get these terms wrong.) One type will be internal AD users, and the other will be external users that I create Mail Contact entries for. We have been asked to make it so that a "Reply All" is not possible to any message sent to these groups. I don't believe that is 100% possible to enforce, for the following reasons. My question is: is my reasoning sound? If not, please feel free to educate me on if/how things can properly be implemented. Thanks! My reasoning on why it's impossible to prevent 100% of potential reply-all actions: (1) An internal AD user could put the DL in their To: field. They then click the '+' to expand the group. The group contains two external mail contacts. The message is sent to everyone, including those external contacts. External user #1 decides to reply-all, and his mail goes to, at least, external user #2, which wouldn't even involve our Exchange mail relays. (2) An internal AD user could place the DL in their Outlook To: field, then click the '+' button to expand the DL. They then fire off an email to everyone that was in the group, but the individual addresses are listed in the To: field. Because we now have a message sent to multiple recipients in the To: field, the addresses have been "exposed", and anyone is free to reply-all; the messages just get sent to everyone in the To: field. Even if we try to set a Reply-To: field for all of these DLs, external mail clients are not obligated to abide by it, or to force their users to abide by it. Are my two points above valid? (I admit they are somewhat similar.) Am I correct to tell our leadership "It is not possible to prevent 100% of the cases where someone will want to Reply-All to these groups UNLESS we train the users sending emails to these groups to use the Bcc: field at all times"? I am dying for any insight or parts of the equation I'm not seeing clearly. Thank you!!!

    Read the article

  • Virtual PC 2007 Full Screen Resolution

    - by Swami
    I have a laptop (1440 x 900) and an external monitor (1920 x 1200), and I'm running Virtual PC 2007. Initially, my VPC could not switch to fullscreen mode on my external monitor because the max resolution for VPC was only 1600 x 1200. I installed a hotfix (http://support.microsoft.com/kb/958162), and after the hotfix I was able to view VPC in fullscreen mode on my external monitor, but now I'm unable to view fullscreen on my laptop. It says "please check that the resolution of the guest is not higher than that of the host". I even tried to decrease the resolution of my Virtual PC to less than that of my laptop, but it always resets itself back up. So now, after the hotfix, I can view fullscreen mode only when my external monitor is plugged in. Any way I can resolve this?

    Read the article

  • Dell Latitude E6420 dual-boot Ubuntu + Windows 7 Optimus graphics problems

    - by Ryan
    I have a Dell Latitude E6420 laptop with Ubuntu 12.04 alongside Windows 7 (dual-boot), docked in a docking station with 2 DVI outputs. It took me a week of tinkering to get the dual external monitors to work in Ubuntu, and I had to disable the "Optimus" feature in the BIOS. But now neither external monitor is detected in Windows, and the resolution there is also very low. Do you know how I can successfully dual-boot Windows 7 and Ubuntu on this machine using my 2 external DVI monitors? I have an open question here too, trying to resolve this same issue: http://askubuntu.com/questions/146933/dock-with-dual-external-dvi-monitors-with-intel-nvidia-optimus

    Read the article

  • How to make a persistent acknowledgment in Icinga/Nagios?

    - by blerontin
    I am using Icinga (a Nagios fork) to monitor the uptime of external hosts and services in addition to our internal ones. Currently, when looking at the "Critical" count, I find it difficult to decide whether an internal service is affected (I should take immediate action) or an external one (I just acknowledge the problem). Is there a way to keep a problem acknowledgement for future downtimes of the checked host/service? Is there some way to auto-acknowledge the state change of external hosts/services?
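
    For what it's worth, the closest I have found is submitting a sticky, persistent acknowledgement through the external command pipe, roughly like the sketch below (Python; the command-file path, host and service names are assumptions from my own setup, not anything official):

        #!/usr/bin/env python3
        # Rough sketch: acknowledge a service problem via the Icinga/Nagios command pipe.
        # The path must match the 'command_file' setting in icinga.cfg/nagios.cfg.
        import time

        CMD_FILE = "/var/lib/icinga/rw/icinga.cmd"  # assumption; adjust to your install

        def ack_service(host, service, author, comment):
            # Format: [ts] ACKNOWLEDGE_SVC_PROBLEM;host;service;sticky;notify;persistent;author;comment
            # sticky=2     -> acknowledgement stays until the service returns to OK
            # notify=0     -> do not send an extra notification
            # persistent=1 -> the comment survives daemon restarts
            cmd = "[{0}] ACKNOWLEDGE_SVC_PROBLEM;{1};{2};2;0;1;{3};{4}\n".format(
                int(time.time()), host, service, author, comment)
            with open(CMD_FILE, "w") as pipe:
                pipe.write(cmd)

        ack_service("external-host.example.com", "HTTP", "autoack", "external service - no action needed")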

    Read the article

  • What software does this log file come from? [closed]

    - by mickula
    Which software does this log file come from? Please specify the full name. The log excerpt:
      Internal IP Threshold FlowsDiff 40 flows/s, Diff: 73 flows/s
      Sum 26.962 flows/300s (89 flows/s), 32.162.000 packets/300s (107.206 packets/s), 1,198 GByte/300s (32 MBit/s)
      External 87.98.238.221, 26.958 flows/300s (89 flows/s), 32.156.000 packets/300s (107.186 packets/s), 1,198 GByte/300s (32 MBit/s)
      External 89.230.69.49, 2 flows/300s (0 flows/s), 2.000 packets/300s (6 packets/s), 0,000 GByte/300s (0 MBit/s)
      External 89.231.190.149, 1 flows/300s (0 flows/s), 3.000 packets/300s (10 packets/s), 0,000 GByte/300s (0 MBit/s)
      External 89.239.101.20, 1 flows/300s (0 flows/s), 1.000 packets/300s (3 packets/s), 0,000 GByte/300s (0 MBit/s)

    Read the article

  • Separate brightness/power-saving settings for multiple monitors

    - by MariusMatutiae
    At home, I connect my laptop to an external monitor. Under Linux and the X server it works perfectly. Occasionally I watch a movie on the external monitor. When I do, I would like to fold the laptop shut and just relax. This, however, triggers my setting that dims the screen after 10 minutes of inactivity, which in turn extends to the external monitor. I know I could solve the problem by setting dimming to "Never", but that is a waste, causes the laptop to overheat, and is bad for normal laptop use. How can I set the power-saving/brightness settings for the external monitor independently of the onboard one? I run Debian or Arch, under KDE.
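
    The closest workaround I can think of is driving the two outputs separately at the X level for the duration of the movie, something like the rough sketch below (the output names LVDS1/HDMI1 are assumptions; check xrandr -q):

        #!/usr/bin/env python3
        # Rough sketch: blank only the laptop panel and suspend blanking/DPMS while a movie
        # plays on the external monitor, then restore everything afterwards.
        import subprocess

        INTERNAL, EXTERNAL = "LVDS1", "HDMI1"   # assumptions; see `xrandr -q`

        def movie_mode(enable):
            if enable:
                subprocess.check_call(["xrandr", "--output", INTERNAL, "--off"])   # panel off
                subprocess.check_call(["xrandr", "--output", EXTERNAL, "--auto"])  # keep external
                subprocess.check_call(["xset", "-dpms"])                           # no DPMS blanking
                subprocess.check_call(["xset", "s", "off"])                        # no screensaver
            else:
                subprocess.check_call(["xrandr", "--output", INTERNAL, "--auto"])
                subprocess.check_call(["xset", "+dpms"])
                subprocess.check_call(["xset", "s", "on"])

        movie_mode(True)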

    Read the article

  • Mac OS X - Time Machine backup fails verification - What can I do to save the history?

    - by usermac75
    Hi, how do I make Time Machine do a new complete backup without losing older versions of backed-up files? Verbose: I am using Time Machine on OS X (Snow Leopard) to back up the whole computer to an external drive. I especially like the "history", i.e. the feature that allows you to restore an older version of a file. Problem: I had some data corruption on my external backup drive; I repaired it with Disk Utility, which found some faults. After that, the external drive was OK and I could use Time Machine again, so I let Time Machine do one more backup. Then I verified the backup as described at http://superuser.com/questions/47628/verifying-time-machine-backups, namely with sudo diff -qr . $HOME/Desktop 2>&1 | tee $HOME/timemachine-diff.log However: the command above reported several differences and missing files, approx. 200 files in sum. While some of the missing files were caches or excluded directories, the differences do bother me, especially as some of my important documents are listed as differing. How can I make sure that the data on the external drive is synced correctly? Is it possible to have Time Machine do a complete new backup without losing the history? Or to have Time Machine compare all files for differences and re-write all files that are different? Or can I set some flag on the files that do not match to have them copied again (like the archive flag in Windows/DOS)? I'd rather not touch the files, because I would like to keep the date of last change/date of creation. Thank you for your thoughts!
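
    In case it helps, this is roughly how I would script a follow-up comparison myself (Python standard library only; the snapshot path below is made up and would have to point at the latest backup on the external drive):

        #!/usr/bin/env python3
        # Rough sketch: recursively compare the live user folder with the latest Time Machine
        # snapshot and list files that differ or are missing (shallow, stat-based comparison).
        import filecmp
        import os

        LIVE = os.path.expanduser("~")
        SNAPSHOT = "/Volumes/Backup/Backups.backupdb/MyMac/Latest/Macintosh HD/Users/me"  # assumption

        def report(dcmp, prefix=""):
            for name in dcmp.diff_files:
                print("differs:           " + os.path.join(prefix, name))
            for name in dcmp.left_only:
                print("missing in backup: " + os.path.join(prefix, name))
            for name, sub in dcmp.subdirs.items():
                report(sub, os.path.join(prefix, name))

        report(filecmp.dircmp(LIVE, SNAPSHOT))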

    Read the article

  • Cross domain LDAP

    - by Adam
    For a system we are developing we have two domains, an internal and an external one, with bidirectional trust between them. However, the servers are only able to connect to their own DCs. We have an application server on the internal domain which needs to use an LDAP query to gather a list of users from a group on the external domain. How do I go about writing an LDAP query that asks one DC to go ask another DC for a list of users? I tried querying the internal DC with the same LDAP query I would use if it could hit the external DC directly, but this does not work. When I use Softerra LDAP Administrator I can view the full hierarchy of the internal domain, but despite the trust relationship between the domains I am unable to see any of the external domain. Any suggestions or help would be greatly appreciated.
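
    For concreteness, this is the kind of query I have in mind, sketched with Python and the ldap3 module, assuming the application server could reach a DC or global catalog of the external domain (the host name, credentials and base DN are made up):

        #!/usr/bin/env python3
        # Rough sketch: read the members of a group in the external domain, either from one
        # of its DCs or from a global catalog (port 3268) that carries both domains.
        from ldap3 import Server, Connection, NTLM, SUBTREE

        server = Server("dc01.external.example.com", port=3268)  # 3268 = global catalog
        conn = Connection(server, user="EXTERNAL\\svc_ldap", password="secret",
                          authentication=NTLM, auto_bind=True)

        conn.search(search_base="DC=external,DC=example,DC=com",
                    search_filter="(&(objectClass=group)(cn=MyGroup))",
                    search_scope=SUBTREE,
                    attributes=["member"])

        for entry in conn.entries:
            for member_dn in entry.member:
                print(member_dn)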

    Read the article

  • Duplication of Windows 7 Backup

    - by Steven Pickles
    I use the built-in backup utility for Windows 7 because it's automated and flexible enough to allow me to schedule a daily shadow-copy backup of particular files and folders directly to a separate internal RAID 0 array (2 x 1 TB). It's also lightweight and stays out of the way. For off-site backup purposes, each week I copy the contents of the internal backup from the RAID 0 array to an external 1 TB drive, and then move this drive to a different building. The copy from the internal backup to the external backup typically works like this: (1) mount and erase the contents of the external drive, (2) highlight the backup "file" on the internal drive and hit CTRL+C, (3) CTRL+V on the root directory of the external drive. Is there a better way to synchronize? Microsoft's SyncToy application does a pitiful job and often leaves the folders not truly synchronized, which completely defeats the ability to use the backup's restore feature.
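
    For reference, this is roughly what I imagine an automated mirror would look like instead of the erase-and-paste routine (Python driving robocopy; the drive letters and folder names are assumptions):

        #!/usr/bin/env python3
        # Rough sketch: mirror the internal backup folder to the external drive. /MIR copies
        # what changed and removes files that no longer exist in the source.
        import subprocess

        SRC = r"D:\WindowsImageBackup"   # internal RAID 0 backup location (assumption)
        DST = r"F:\WindowsImageBackup"   # weekly off-site external drive (assumption)

        rc = subprocess.call(["robocopy", SRC, DST, "/MIR", "/R:2", "/W:5",
                              r"/LOG:C:\backup-mirror.log"])
        # robocopy exit codes below 8 mean success (0 = nothing to copy, 1 = files copied)
        if rc >= 8:
            raise RuntimeError("robocopy reported errors, exit code %d" % rc)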

    Read the article

  • ubuntu server refusing connections via port forwarding

    - by Matt
    Getting some really weird behavior from our Ubuntu server... it's behind a Verizon router firewall with port forwarding (port 8080 to port 80 on the server), and we've been having issues accessing it via this external IP. From within the network, it appears to respond normally (I can access it via web browser and SSH), but refuses connections through port forwarding (using our static external IP). The strangest thing is that it actually responds to external port-forwarded connections right after being restarted, but quickly lapses back into this pattern of refusing external connections. I'm a bit of a server newbie (I'm actually a programmer in a small startup that just lost their server ops guy, urgh) so this is all trial by fire for me. Does anyone have any advice on what could be going wrong here? Any help would be appreciated, thanks.

    Read the article

  • Configuring Windows Server Backup Destination Drive Sets

    - by Nicholas
    Is it possible to set up the standard backup system in SBS 2011 (or Server 2008 R2) to use an internal drive as a destination as well as external drives? Before you say yes: from my tests and from what I've read on the web, backups with an internal drive included as a destination always seem to prefer the internal drive over connected external ones (regardless of which drive might be marked as 'active'), so no data ever gets written to a plugged-in external drive. In my opinion, external drives should always have priority over internal drives; otherwise including them is pointless.

    Read the article

  • Open ports broken from internal network

    - by ksvi
    Quick summary: a forwarded port works from the outside world, but from the internal network, using the external IP, the connection is refused. This is a simplified situation to make the explanation easier: I have a computer that is running a service on port 12345. This computer has the internal IP 192.168.1.100 and is connected directly to a modem/router which has the internal IP 192.168.1.1 and the external (public, static) IP 1.2.3.4. (The router is a TP-LINK TD-W8960N.) I have set up port forwarding (virtual server) on port 12345 to go to port 12345 at 192.168.1.100. If I run telnet 192.168.1.100 12345 from the same computer, everything works. But running telnet 1.2.3.4 12345 says connection refused. If I do this on another computer (on the same internal network, connected to the router) the same thing happens. This would suggest the port forwarding is not working. However... if I run an online port-checking service against my external IP and the service port, it says the port is open, and I can see the remote server connecting and immediately closing the connection. And using another computer that is connected to the internet via a mobile connection, I can also telnet 1.2.3.4 12345 and get a working connection. So the port forwarding seems to be working; however, using the external IP from the internal network doesn't. I have no idea what could be causing this, since another setup very much like this (with a different router) works for me: I can access a service running on a server from inside the network through both the internal and the external IP.
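
    For completeness, here is a small sketch that reproduces the telnet checks from a machine inside the LAN (Python; the addresses are the ones from the example above). If only the external-IP test from inside fails, that would point at the router not supporting NAT loopback (hairpinning), but I'm not sure that is what is happening here:

        #!/usr/bin/env python3
        # Rough sketch: try a TCP connection to the forwarded port via the internal and the
        # external address and report which one succeeds.
        import socket

        TARGETS = [("192.168.1.100", 12345),  # server's internal address
                   ("1.2.3.4", 12345)]        # router's external address (hairpin test)

        for host, port in TARGETS:
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.settimeout(3)
            try:
                s.connect((host, port))
                print("%s:%d -> connected" % (host, port))
            except OSError as exc:
                print("%s:%d -> %s" % (host, port, exc))
            finally:
                s.close()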

    Read the article

  • Fixed or consistent mount location for a USB hard drive

    - by Journeyman Geek
    I store my music on an external hard drive and play it with foobar2k. However, the drive letter changes, which usually means I need to rebuild a fairly large playlist every so often. I'm wondering if there's a way to reserve a drive letter for a specific external device (or type of device) by device ID or volume name, or if I'm better off using an NTFS mount point and re-mounting the drive to a folder each time. I'm using either a Windows XP or Windows 7 system, and the external drive is NTFS.
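
    If the mount-point route is the way to go, I picture something like the rough sketch below (the volume GUID is a placeholder; running mountvol with no arguments lists the real ones, and the command needs an elevated prompt):

        #!/usr/bin/env python3
        # Rough sketch: bind the music volume to a fixed NTFS folder so player paths never
        # change, instead of relying on whatever drive letter Windows hands out.
        import os
        import subprocess

        MOUNT_POINT = r"C:\Mount\Music"
        VOLUME = "\\\\?\\Volume{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}\\"  # placeholder GUID

        if not os.path.isdir(MOUNT_POINT):
            os.makedirs(MOUNT_POINT)

        # mountvol <folder> <volume name> mounts the volume at the empty NTFS folder
        subprocess.check_call(["mountvol", MOUNT_POINT, VOLUME])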

    Read the article

  • Formatting a former RAID 0 drive through USB

    - by EXC
    I'll try to be as specific as possible here: I was using two Hitachi 2.5" 500 GB HDDs in my Gateway P-7805u laptop in a RAID 0 configuration. The array was causing the laptop to run extremely hot, however, so I removed the drives and deleted the RAID array through the Intel Matrix Storage Manager. I did a clean install of Windows 7 on the original 320 GB HDD that came with the laptop. I never did format the original RAID-array HDDs before taking them out of the computer. Now I am attempting to format the Hitachi 500 GB RAID-array HDDs externally through a USB enclosure. The external HDD drivers install on my clean-install OS, but when I go into 'My Computer' there is no external drive available. I cannot format from the command prompt because Windows will not assign a drive letter to the external HDD. The drivers install and the HDD is recognized as a Hitachi external drive, but nothing shows up in the My Computer window. I need to know if there is a way to format these drives to NTFS externally.
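
    A rough sketch of the diskpart route I am considering, in case that is the right direction (the disk number and drive letter are guesses, and clean wipes the selected disk, so it has to be the correct one):

        #!/usr/bin/env python3
        # Rough sketch: script diskpart to clean the former RAID member, create a partition,
        # format it NTFS and assign a letter. Run from an elevated prompt.
        import subprocess
        import tempfile

        DISK_NUMBER = 2  # assumption; verify with diskpart's `list disk` first
        SCRIPT = """select disk {0}
        clean
        create partition primary
        format fs=ntfs quick
        assign letter=H
        """.format(DISK_NUMBER)

        with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
            f.write(SCRIPT)
            script_path = f.name

        subprocess.check_call(["diskpart", "/s", script_path])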

    Read the article

  • Grub File Stolen (Windows and openSUSE)

    - by NESIRSUSEJ
    I have a problem with my computer. I installed openSUSE on my external drive, and I still have Windows on my internal HDD. openSUSE put the GRUB bootloader on the external drive, so now I have to boot openSUSE's drive in order to start Windows. I never got a Windows CD when I bought my computer (I live in South Africa :) ). I want to install Ubuntu 12.04 on my external drive, but then I will have to format it, in which case I will lose the GRUB files, causing me to lose access to Windows, which I can't afford to do... yet. Does anyone have an idea what I can do?

    Read the article

  • taglib link errors

    - by Vihaan Verma
    I'm using TagLib for one of my projects. The Debug/Release library is built using MSVC 10. When compiling my code against the library in taglib/taglib/Release, several linker errors are thrown, all of them LNK2019 "unresolved external symbol" errors for __declspec(dllimport) TagLib functions referenced from my function ID3::getMetaDataOfFile(std::string):
      TagLib::FileRef::audioProperties() const
      TagLib::String::~String()
      TagLib::String::to8Bit(bool) const
      TagLib::FileRef::~FileRef()
      TagLib::FileRef::tag() const
      TagLib::FileRef::isNull() const
      TagLib::FileRef::FileRef(TagLib::FileName, bool, TagLib::AudioProperties::ReadStyle)
      TagLib::FileName::FileName(char const *)
    I'm only including tag.lib from the taglib/taglib/Release folder. Is there some other library I'm missing?

    Read the article

  • GPLv2 - Multiple AI chess engines to bypass GPL

    - by Dogbert
    I have gone through a number of GPL-related questions, the most recent being this one: http://stackoverflow.com/questions/3248823/legal-question-about-the-gpl-license-net-dlls/3249001#3249001 I'm trying to see how this would work, so bear with me. I have a simple GUI for a game of chess. It essentially sends/receives commands to/from an external chess engine (e.g. Tong, Fruit, etc.). The application/GUI is similar in nature to XBoard (http://www.gnu.org/software/xboard/), but was independently designed. After going through a number of threads on this topic, it seems that the FSF considers dynamically linking against a GPLv2 library to create a derivative work, and that by doing so the GPLv2 extends to my proprietary code, so I must release the source of my entire project. Other legal precedents indicate the opposite: that dynamic linking doesn't cause the "viral" effect of the GPL to propagate to my proprietary code. Since there is no official consensus that can give a hard-and-fast answer to the dynamic-linking question, would this be an acceptable alternative: (1) I build my chess GUI so that it sends/receives the chess-engine AI logic as text commands through an external interface library that I write. (2) That interface library is itself released under the GPL. (3) The interface library is only used to communicate via a generic text pipe with external command-line chess engines. (4) The chess engine itself would be built as a command-line utility rather than as a library of any sort, and just sends strings in the Universal Chess Interface or Chess Engine Communication Protocol (http://en.wikipedia.org/wiki/Chess_Engine_Communication_Protocol) format. The one "gotcha" is that the interface library should not be specific to one single GPL'ed chess engine, otherwise the entire GUI would be "entirely dependent" on it. So I just make my interface library able to connect to any command-line chess engine that uses a specific format, rather than one unique engine, and I could then include pre-built command-line versions of any of the chess engines I'm using. Would that sort of approach allow me to: NOT release the source for my UI; release the source of the interface library I built (if necessary); and use one or more chess engines, bundled as external command-line utilities that ship with a binary version of my UI? Thank you.
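
    To make the proposed architecture concrete, here is a rough sketch of how the GUI side would drive any external engine over a plain text pipe using UCI commands (Python; the engine path is a placeholder):

        #!/usr/bin/env python3
        # Rough sketch: the GUI spawns a command-line engine and exchanges UCI text commands
        # with it over stdin/stdout - no linking against the engine at all.
        import subprocess

        engine = subprocess.Popen(["./fruit"], stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                                  universal_newlines=True, bufsize=1)  # path is a placeholder

        def send(cmd):
            engine.stdin.write(cmd + "\n")
            engine.stdin.flush()

        def read_until(token):
            while True:
                line = engine.stdout.readline().strip()
                if line.startswith(token):
                    return line

        send("uci")
        read_until("uciok")            # handshake
        send("isready")
        read_until("readyok")
        send("position startpos moves e2e4")
        send("go depth 12")
        print(read_until("bestmove"))  # e.g. "bestmove e7e5 ponder g1f3"
        send("quit")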

    Read the article

  • Can't install Ubuntu on Asus Eee 1015pem

    - by Peter
    I'm having trouble installing Ubuntu. I use an ASUS Eee 1015pem netbook. Recently my netbook got wet: I had it inside my backpack and all my things got wet. The netbook boots up fine but it will not load the OS. I downloaded Ubuntu onto my external hard drive and changed the settings in my BIOS to boot from a removable device. Nothing happens. When I plug in my external hard drive I'm not able to get to the boot icon; I have to unplug the external hard drive, set my boot settings (I tried both Removable and CD-ROM), then plug my external drive back in, and nothing happens with either setting. My ASUS never came with a recovery disk and is supposed to have a built-in recovery reached by pressing F9 in the BIOS. Also, I would need to disable Boot Booster in the BIOS, and Boot Booster is not even an option there. My friend told me to try installing Ubuntu, but now I'm having no luck with Ubuntu either. Any suggestions?

    Read the article

  • How can I make multiple displays work on my Asus UX32VD?

    - by oKtosiTe
    Original title: Why do I have two trash icons in the Unity Launcher? Whether I run Ubuntu as a live USB or install it, I always have two trash bins on the Unity Launcher. Both work, and both open the same location. This seems a bit redundant; what could be done about it? Update: Turning auto-hide on made it obvious that I have multiple Launchers showing. With auto-hide off, they simply overlap, making it look like there's a double trash icon, but with auto-hide enabled, I can display one Launcher (and therefore one trash icon) at a time. Still, two are running simultaneously. Second update: This problem appears to be caused by the way Ubuntu handles multiple displays on my Asus UX32VD Ultrabook. Somehow, the laptop display cannot be used while my external display is connected. It is shown in the Displays list, but remains black no matter how I configure it. The external display runs at 1920x1200, while the laptop monitor should run at 1920x1080. It therefore becomes obvious that the Launcher that's supposed to run on the laptop display is actually displayed on the external monitor. Using nomodeset as a kernel parameter as indicated here makes the laptop display inaccessible altogether, detecting the external monitor as the laptop display and making resolutions other than 1920x1200 inaccessible. That is not an option.

    Read the article

  • Depending on a fixed version of a library and ignoring its updates

    - by Moataz Elmasry
    I was talking to a technical boss yesterday. It's about a project in C++ that depends on OpenCV; he wanted to put a specific OpenCV version into the SVN and keep using this version, ignoring any updates, which I disagreed with. We had a heated discussion about that. His arguments: Everything has to be delivered in one package, and we can't ask the client to install external libraries. We depend on a fixed version so that new updates of OpenCV won't break our code. We can't guarantee that within a version update, e.g. from 3.2.buildx to 3.2.buildy, the function signatures won't change. My arguments: True, everything has to be delivered to the client as one package, but that's what build scripts are for: they download the external libraries and create a bundle. Within updates of the same version, 3.2.buildx to 3.2.buildy, it's practically impossible that a signature changes, unless it is a really crappy framework, which isn't the case with OpenCV. We deprive ourselves of new updates and features of the library. If there's a bug in the version we took, then even if there's a fix later, we won't be able to get it. It's simply inefficient and bad design to depend on a certain version/build of an external library, as it makes our project difficult to adapt to new changes in the future. So I'd like to know what you guys think: does it really make sense to include a specific version of an external library in our SVN and keep using it, ignoring all updates?
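
    To make the build-script argument concrete, here is a rough sketch of what I mean by pinning a version without committing binaries to SVN (Python 3; the URL and checksum are placeholders, not real values):

        #!/usr/bin/env python3
        # Rough sketch: the build pins an exact OpenCV release, downloads it and verifies a
        # checksum before bundling, instead of keeping the library itself in the repository.
        import hashlib
        import urllib.request

        OPENCV_VERSION = "3.2.0"
        URL = "https://example.com/opencv/opencv-{0}.tar.gz".format(OPENCV_VERSION)  # placeholder
        EXPECTED_SHA256 = "0" * 64                                                   # placeholder

        def fetch_pinned(url, expected_sha256, out_path):
            data = urllib.request.urlopen(url).read()
            if hashlib.sha256(data).hexdigest() != expected_sha256:
                raise RuntimeError("checksum mismatch: the pinned dependency changed")
            with open(out_path, "wb") as fh:
                fh.write(data)

        fetch_pinned(URL, EXPECTED_SHA256, "third_party/opencv-{0}.tar.gz".format(OPENCV_VERSION))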

    Read the article
