Semaphore (DispatchSemaphore)
I once ran into a problem where one request needed a parameter obtained from another request. The obvious approach is to fire the second request inside the first request's callback, but that couples the two requests tightly. A semaphore can be used to decouple them. Let's first look at the three related methods:
dispatch_semaphore_t dispatch_semaphore_create(long value)
Takes a long and returns a dispatch_semaphore_t semaphore whose initial count is the value passed in.
long dispatch_semaphore_wait(dispatch_semaphore_t dsema, dispatch_time_t timeout)
Takes a semaphore and a timeout. If the semaphore's count is 0, the current thread blocks until the count becomes greater than 0 (via a signal) or the timeout elapses. If the count is greater than 0, it is decremented by 1, the function returns, and execution continues.
long dispatch_semaphore_signal(dispatch_semaphore_t dsema)
Increments the semaphore's count by 1 and returns.
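A minimal sketch of the timeout parameter described above, using the Swift DispatchSemaphore wrapper: nothing ever signals this semaphore, so the wait returns .timedOut once the deadline passes.

```swift
import Dispatch

// Nothing ever signals this semaphore, so wait(timeout:) gives up
// after roughly 0.1 s and returns .timedOut.
let sem = DispatchSemaphore(value: 0)
let outcome: String
switch sem.wait(timeout: .now() + 0.1) {
case .success:
    outcome = "success"      // a signal arrived before the deadline
case .timedOut:
    outcome = "timedOut"     // the deadline passed first
}
print(outcome)
```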
Here is how to use these to keep threads synchronized:
let semaphore = DispatchSemaphore(value: 0)
var i = 10
DispatchQueue.global().async {
    i = 100
    semaphore.signal()   // count +1
}
semaphore.wait()         // blocks until the signal above fires
print("i = \(i)")
This prints i = 100. If you comment out the semaphore.wait() line, it prints i = 10 instead.
Note: Because the block is submitted asynchronously to a global concurrent queue, the main thread skips over it and goes straight to semaphore.wait(). Since the semaphore's count is 0, the current thread blocks until the block running on the background thread reaches semaphore.signal() and raises the count to 1; the wait then succeeds and the program continues. This keeps the two threads synchronized.
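The chained-request problem from the introduction can be sketched the same way. fetchToken and fetchProfile below are hypothetical placeholders that simulate network calls on a background queue; the semaphore lets the second request start only once the first has delivered its value, without nesting one callback inside the other.

```swift
import Dispatch

// fetchToken / fetchProfile are hypothetical stand-ins that simulate
// network requests by completing on a background queue.
func fetchToken(completion: @escaping (String) -> Void) {
    DispatchQueue.global().async { completion("abc123") }
}
func fetchProfile(token: String, completion: @escaping (String) -> Void) {
    DispatchQueue.global().async { completion("profile-for-\(token)") }
}

let tokenReady = DispatchSemaphore(value: 0)
var token = ""
fetchToken { t in
    token = t
    tokenReady.signal()      // first request finished
}
tokenReady.wait()            // block until the token is available

let profileReady = DispatchSemaphore(value: 0)
var profile = ""
fetchProfile(token: token) { p in
    profile = p
    profileReady.signal()    // second request finished
}
profileReady.wait()
print(profile)
```

In a real app the waiting code should itself run on a background queue; calling wait() on the main thread would freeze the UI until the request completes.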
Locking a thread (the value passed to the initializer also controls the maximum concurrency: a value of n allows n blocks into the critical section at once)
let semaphore = DispatchSemaphore(value: 1)
for i in 0..<100 {
    DispatchQueue.global().async {
        semaphore.wait()     // count -1; blocks while another block holds it
        print("i = \(i)")
        semaphore.signal()   // count +1
    }
}
Note: When thread 1 reaches semaphore.wait(), the count is 1, so it is decremented to 0 and thread 1 continues. If thread 2 calls semaphore.wait() before thread 1 has executed its print, the count is 0, so thread 2 blocks (the Swift .wait() with no timeout corresponds to DISPATCH_TIME_FOREVER in Objective-C) and stays in a waiting state until thread 1 finishes the print and calls semaphore.signal(), raising the count back to 1. Thread 2 is then unblocked and continues. This guarantees that only one thread executes the print line at a time.
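To illustrate the claim that the initial value controls the maximum concurrency, here is a small sketch with value: 2 that records the peak number of blocks inside the critical section at once (the serial counter queue is just bookkeeping added for the demonstration):

```swift
import Dispatch
import Foundation

// A semaphore created with value 2 lets at most two of the submitted
// blocks run the critical section at the same time.
let gate = DispatchSemaphore(value: 2)
let counterQueue = DispatchQueue(label: "counter")  // serializes the bookkeeping
var current = 0   // blocks currently inside the critical section
var peak = 0      // highest concurrency observed
let group = DispatchGroup()

for _ in 0..<10 {
    DispatchQueue.global().async(group: group) {
        gate.wait()                 // count -1; blocks while two are inside
        counterQueue.sync {
            current += 1
            peak = max(peak, current)
        }
        Thread.sleep(forTimeInterval: 0.01)   // simulate work
        counterQueue.sync { current -= 1 }
        gate.signal()               // count +1; admits the next block
    }
}
group.wait()
print("peak concurrency: \(peak)")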
Barrier
After multiple asynchronously submitted tasks have finished, execute the next task; this is typically done with a barrier.
// The same initializer without .concurrent creates a serial queue,
// on which the tasks below would simply run one after another:
// let queue = DispatchQueue(label: "test", qos: .default, attributes: .init(rawValue: 0), autoreleaseFrequency: .workItem, target: nil)
let queue = DispatchQueue(label: "test", qos: .default, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)
queue.async {    // task 1
    for _ in 0...3 {
        print("......")
    }
}
queue.async {    // task 2
    for _ in 0...3 {
        print("++++++")
    }
}
queue.async(group: nil, qos: .default, flags: .barrier) {
    print("group")
}
queue.async {
    print("finish")
}
......
++++++
++++++
++++++
++++++
......
......
......
group
finish
Note: A barrier lets all tasks submitted to the queue before it finish first, and only then runs; tasks submitted after it wait for the barrier to finish. Submitted with async, the barrier blocks the queue, not the thread that submitted it (queue.sync(flags: .barrier) would additionally block the caller). Barriers only have this effect on a custom concurrent queue, not on the global queues.
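A common application of this ordering guarantee is the reader/writer pattern: reads run concurrently via sync, while writes go through an async barrier and get exclusive access. SafeDictionary below is an illustrative type, not something from the original post.

```swift
import Dispatch

// Reader/writer pattern on a concurrent queue: concurrent reads,
// exclusive (barrier) writes.
final class SafeDictionary {
    private var storage: [String: Int] = [:]
    private let queue = DispatchQueue(label: "safe.dict", attributes: .concurrent)

    func value(for key: String) -> Int? {
        queue.sync { storage[key] }        // concurrent read
    }

    func set(_ value: Int, for key: String) {
        queue.async(flags: .barrier) {     // exclusive write
            self.storage[key] = value
        }
    }
}

let dict = SafeDictionary()
dict.set(1, for: "a")
dict.set(2, for: "b")
print(dict.value(for: "a") ?? -1)    // the read waits for the barrier writes
```

Because queue submissions are FIFO, a sync read enqueued after a barrier write cannot run until the write has completed, so readers never observe a half-finished update.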
Delayed task
let queue = DispatchQueue(label: "test", qos: .default, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)
queue.async {    // task 1
    for _ in 0...3 {
        print("......")
    }
}
print("0")
queue.asyncAfter(deadline: DispatchTime.now() + 10, execute: {
    print("delayed")
})
queue.async {    // task 2
    for _ in 0...3 {
        print("++++++")
    }
}
Note: The block is submitted for execution 10 s later; asyncAfter returns immediately and does not block the current thread. The deadline is the earliest time the block may run, not an exact one.
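If the delayed block may need to be called off before the deadline fires, it can be wrapped in a DispatchWorkItem and cancelled; a small sketch:

```swift
import Dispatch
import Foundation

// Wrapping the delayed block in a DispatchWorkItem makes it cancellable.
var fired = false
let work = DispatchWorkItem { fired = true }
DispatchQueue.global().asyncAfter(deadline: .now() + 0.2, execute: work)
work.cancel()    // cancelled before the deadline, so the block never runs
Thread.sleep(forTimeInterval: 0.4)   // let the deadline pass
print("fired: \(fired)")
```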
Groups (DispatchGroup)
notify (dependent task)
let queue = DispatchQueue(label: "test", qos: .default, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)
let group = DispatchGroup()
queue.async(group: group, qos: .default, flags: [], execute: {
    for _ in 0...10 {
        print("task 1")
    }
})
queue.async(group: group, qos: .default, flags: [], execute: {
    for _ in 0...10 {
        print("task 2")
    }
})
// The notify block can run on any queue, e.g. a custom myQueue
group.notify(queue: queue) {
    print("all tasks finished")
}
queue.async {
    print("next task")
}
Output (truncated):
task 2
task 1
task 2
task 2
task 2
task 2
task 1
task 2
task 1
task 1
task 1
task 1
Note: group + notify likewise waits for the previously added tasks to finish before running the notify block, and it never blocks the current thread. The difference from a barrier is that a barrier only orders tasks on its own concurrent queue, while a group can track tasks submitted to any number of queues.
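One caveat worth sketching: async(group:) only tracks work that finishes inside the block. When the block kicks off an asynchronous callback (e.g. a network request), the group returns to zero before the callback runs, so it must be balanced manually with enter()/leave(). simulateRequest below is a hypothetical placeholder.

```swift
import Dispatch

// simulateRequest stands in for a network call that completes
// asynchronously on a background queue.
func simulateRequest(_ value: Int, completion: @escaping (Int) -> Void) {
    DispatchQueue.global().async { completion(value * 2) }
}

let group = DispatchGroup()
var results: [Int] = []
let lock = DispatchQueue(label: "results")   // serializes access to results

for i in 1...3 {
    group.enter()                    // one enter per outstanding callback
    simulateRequest(i) { doubled in
        lock.sync { results.append(doubled) }
        group.leave()                // balance the enter()
    }
}
group.wait()                         // in real code, prefer notify(queue:)
print(results.sorted())
```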
wait (waiting for a task)
let queue = DispatchQueue(label: "test", qos: .default, attributes: .concurrent, autoreleaseFrequency: .workItem, target: nil)
let group = DispatchGroup()
queue.async(group: group, qos: .default, flags: [], execute: {
    for _ in 0...5 {
        print("task 1")
    }
})
queue.async(group: group, qos: .default, flags: [], execute: {
    for _ in 0...5 {
        print("task 2")
    }
})
// Pass .distantFuture instead to wait indefinitely
let result = group.wait(timeout: .now() + 2.0)
switch result {
case .success:
    print("all tasks finished before the timeout")
case .timedOut:
    print("timed out; some tasks are still running")
}
print("next task")
Output:
task 1
task 2
task 1
task 2
task 1
task 2
task 1
task 2
task 1
task 2
task 1
task 2
all tasks finished before the timeout
next task
Note: With group + wait, setting timeout: .distantFuture waits indefinitely for all the previous tasks to finish, much like a barrier; with a finite timeout, it returns as soon as the tasks finish or the deadline passes, and the next task then executes. Unlike notify, wait blocks the current thread until it returns.
References:
"Dispatch_semaphore learning for iOS GCD" by Mengxiaocai
"Common methods of Swift 3.0 GCD" by Mingyue in front of the bed