Struct google_api_proto::google::cloud::aiplatform::v1::schema::trainingjob::definition::AutoMlImageClassificationInputs
pub struct AutoMlImageClassificationInputs {
    pub model_type: i32,
    pub base_model_id: String,
    pub budget_milli_node_hours: i64,
    pub disable_early_stopping: bool,
    pub multi_label: bool,
}
Fields
model_type: i32
base_model_id: String
The ID of the base model. If it is specified, the new model will be trained based on the base model. Otherwise, the new model will be trained from scratch. The base model must be in the same Project and Location as the new Model to train, and have the same modelType.
budget_milli_node_hours: i64
The training budget of creating this model, expressed in milli node hours, i.e. a value of 1,000 in this field means 1 node hour. The actual metadata.costMilliNodeHours will be equal to or less than this value. If further model training ceases to provide any improvements, it will stop without using the full budget and metadata.successfulStopReason will be model-converged.
Note: node_hour = actual_hour * number_of_nodes_involved.
For modelType cloud (default), the budget must be between 8,000 and 800,000 milli node hours, inclusive. The default value is 192,000, which represents one day in wall time, assuming 8 nodes are used.
For model types mobile-tf-low-latency-1, mobile-tf-versatile-1, and mobile-tf-high-accuracy-1, the training budget must be between 1,000 and 100,000 milli node hours, inclusive. The default value is 24,000, which represents one day in wall time on a single node.
disable_early_stopping: bool
Use the entire training budget. This disables the early stopping feature. When false, the early stopping feature is enabled, which means that AutoML Image Classification might stop training before the entire training budget has been used.
multi_label: bool
If false, a single-label (multi-class) Model will be trained (i.e. assuming that for each image just up to one annotation may be applicable). If true, a multi-label Model will be trained (i.e. assuming that for each image multiple annotations may be applicable).
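As a minimal construction sketch, the snippet below fills in every field for a cloud model; the concrete values are illustrative only, and the nested ModelType path and Cloud variant follow prost's usual code generation, so treat them as assumptions rather than guarantees.

use google_api_proto::google::cloud::aiplatform::v1::schema::trainingjob::definition::{
    auto_ml_image_classification_inputs::ModelType, AutoMlImageClassificationInputs,
};

fn main() {
    // All concrete values below are illustrative, not API requirements.
    let inputs = AutoMlImageClassificationInputs {
        model_type: ModelType::Cloud as i32, // enum is stored as a raw i32
        base_model_id: String::new(),        // empty: train from scratch
        budget_milli_node_hours: 192_000,    // default for cloud: one day on 8 nodes
        disable_early_stopping: false,       // allow AutoML to stop early
        multi_label: false,                  // single-label (multi-class) model
    };
    println!("budget: {} milli node hours", inputs.budget_milli_node_hours);
}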
Implementations
impl AutoMlImageClassificationInputs
pub fn model_type(&self) -> ModelType
Returns the enum value of model_type, or the default if the field is set to an invalid enum value.
pub fn set_model_type(&mut self, value: ModelType)
Sets model_type to the provided enum value.
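Because model_type is stored as a raw i32, these accessors are the safer way to read and write it. A short usage sketch, again assuming the nested enum is generated at auto_ml_image_classification_inputs::ModelType with a MobileTfVersatile1 variant:

use google_api_proto::google::cloud::aiplatform::v1::schema::trainingjob::definition::{
    auto_ml_image_classification_inputs::ModelType, AutoMlImageClassificationInputs,
};

fn main() {
    let mut inputs = AutoMlImageClassificationInputs::default();

    // Write through the setter rather than assigning a raw i32.
    inputs.set_model_type(ModelType::MobileTfVersatile1);

    // The getter falls back to the default variant if the stored i32
    // does not match any known enum value.
    assert_eq!(inputs.model_type(), ModelType::MobileTfVersatile1);
}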
Trait Implementations
impl Clone for AutoMlImageClassificationInputs
fn clone(&self) -> AutoMlImageClassificationInputs
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
impl Message for AutoMlImageClassificationInputs
fn encoded_len(&self) -> usize
fn encode(&self, buf: &mut impl BufMut) -> Result<(), EncodeError>
where
    Self: Sized,
fn encode_to_vec(&self) -> Vec<u8>
where
    Self: Sized,
fn encode_length_delimited(&self, buf: &mut impl BufMut) -> Result<(), EncodeError>
where
    Self: Sized,
fn encode_length_delimited_to_vec(&self) -> Vec<u8>
where
    Self: Sized,
fn decode(buf: impl Buf) -> Result<Self, DecodeError>
where
    Self: Default,
fn decode_length_delimited(buf: impl Buf) -> Result<Self, DecodeError>
where
    Self: Default,
fn merge(&mut self, buf: impl Buf) -> Result<(), DecodeError>
where
    Self: Sized,
Decodes an instance of the message from a buffer, and merges it into self.
fn merge_length_delimited(&mut self, buf: impl Buf) -> Result<(), DecodeError>
where
    Self: Sized,
Decodes a length-delimited instance of the message from buffer, and merges it into self.
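These methods come from the prost::Message trait, so a plain protobuf encode/decode round trip looks roughly like the sketch below (assuming the prost crate is available as a dependency alongside google_api_proto).

use prost::Message;
use google_api_proto::google::cloud::aiplatform::v1::schema::trainingjob::definition::AutoMlImageClassificationInputs;

fn main() -> Result<(), prost::DecodeError> {
    let inputs = AutoMlImageClassificationInputs {
        budget_milli_node_hours: 24_000, // illustrative value
        ..Default::default()
    };

    // Serialize to a freshly allocated buffer ...
    let bytes = inputs.encode_to_vec();
    assert_eq!(bytes.len(), inputs.encoded_len());

    // ... and decode it back into an equal message.
    let decoded = AutoMlImageClassificationInputs::decode(bytes.as_slice())?;
    assert_eq!(decoded, inputs);
    Ok(())
}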
impl PartialEq for AutoMlImageClassificationInputs
fn eq(&self, other: &AutoMlImageClassificationInputs) -> bool
This method tests for self and other values to be equal, and is used by ==.
impl StructuralPartialEq for AutoMlImageClassificationInputs
Auto Trait Implementations
impl Freeze for AutoMlImageClassificationInputs
impl RefUnwindSafe for AutoMlImageClassificationInputs
impl Send for AutoMlImageClassificationInputs
impl Sync for AutoMlImageClassificationInputs
impl Unpin for AutoMlImageClassificationInputs
impl UnwindSafe for AutoMlImageClassificationInputs
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoRequest<T> for T
fn into_request(self) -> Request<T>
Wrap the input message T in a tonic::Request
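Thanks to this blanket impl, the message can be handed to a generated tonic client either directly or wrapped explicitly. A small sketch, assuming tonic is a dependency of the calling crate:

use tonic::IntoRequest;
use google_api_proto::google::cloud::aiplatform::v1::schema::trainingjob::definition::AutoMlImageClassificationInputs;

fn main() {
    let inputs = AutoMlImageClassificationInputs::default();

    // Wraps the message in a tonic::Request with fresh metadata and extensions.
    let request = inputs.into_request();

    // The wrapped message is still accessible by reference.
    let _message: &AutoMlImageClassificationInputs = request.get_ref();
}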